Capstone Project :- Computer Vision Group 2¶
• DOMAIN: Automotive Surveillance.
• CONTEXT:
Computer vision can be used to automate supervision and generate action appropriate action trigger if the event is predicted from the image of interest. For example a car moving on the road can be easily identified by a camera as make of the car, type, colour, number plates etc.
• DATA DESCRIPTION:
The Cars dataset contains 16,185 images of 196 classes of cars. The data is split into 8,144 training images and 8,041 testing images, where each class has been split roughly in a 50-50 split. Classes are typically at the level of Make, Model, Year, e.g. 2012 Tesla Model S or 2012 BMW M3 coupe.
Data description:
‣ Train Images: Consists of real images of cars as per the make and year of the car.
‣ Test Images: Consists of real images of cars as per the make and year of the car.
‣ Train Annotation: Consists of bounding box region for training images.
‣ Test Annotation: Consists of bounding box region for testing images.
Dataset has been attached along with this project. Please use the same for this capstone project. Original link to the dataset for your reference only: https://www.kaggle.com/jutrera/stanford-car-dataset-by-classes-folder [ for your reference only ]
Reference: 3D Object Representations for Fine-Grained Categorisation, Jonathan Krause, Michael Stark, Jia Deng, Li Fei-Fei 4th IEEE Workshop on 3D Representation and Recognition, at ICCV 2013 (3dRR-13). Sydney, Australia. Dec. 8, 2013.
• PROJECT OBJECTIVE: Design a DL based car identification model.
• PROJECT TASK: [ Score: 100 points]
1. Milestone 1: [ Score: 40 points]
‣ Input: Context and Dataset
‣ Process:
‣ Step 1: Import the data. [ 3 points ]
‣ Step 2: Map training and testing images to its classes. [ 6 points ]
‣ Step 3: Map training and testing images to its annotations. [ 6 points ]
‣ Step 4: Display images with bounding box. [ 5 points ]
‣ Step 5: Design, train and test basic CNN models to classify the car. [ 10 points ]
‣ Step 6: Interim report [ 10 points ]
‣ Submission: Interim report, Jupyter Notebook with all the steps in Milestone-1
Milestone 1¶
Step 1: Importing the Libraries¶
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import os
import random
from os import path
# from google.colab import drive
import cv2
import random
import tensorflow as tf
C:\Users\LENOVO\anaconda3\lib\site-packages\scipy\__init__.py:155: UserWarning: A NumPy version >=1.18.5 and <1.25.0 is required for this version of SciPy (detected version 1.26.4
warnings.warn(f"A NumPy version >={np_minversion} and <{np_maxversion}"
# Check if the tensorflow is configured to utilise the GPU
tf.config.list_physical_devices("GPU")
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"
import tensorflow as tf
print("Num GPUs Available: ", len(tf.config.list_physical_devices('GPU')))
Num GPUs Available: 1
# To extend the GPU memory
import tensorflow as tf
gpus = tf.config.experimental.list_physical_devices('GPU')
tf.config.experimental.set_memory_growth(gpus[0], True)
# To clear GPU memory
# from numba import cuda
# device = cuda.get_current_device()
# device.reset()
# or
# import tensorflow as tf
# tf.keras.backend.clear_session()
Step 2: Data Extraction¶
# Defining the path at which the Annotations zip file is kept
zip_path = 'C:\\Users\\adity\\Downloads\\capstone\\Annotations.zip'
# Exporting the Annotations from the zip file
from zipfile import ZipFile
with ZipFile(zip_path,'r') as zip:
zip.extractall('C:\\Users\\adity\\Downloads\\capstone\\')
# Defining the path at which the zip file containing the images is kept
zip_path_2 = 'C:\\Users\\adity\\Downloads\\capstone\\Car+Images.zip'
# Exporting the images from the zip file
from zipfile import ZipFile
with ZipFile(zip_path_2,'r') as zip:
zip.extractall('C:\\Users\\adity\\Downloads\\capstone\\')
Step 3: Data Loading¶
# Step 1 & 2) Map the images from train folder with train and test labels to form a DataFrame. [6 Marks]
# Data Extraction
import os
import cv2
import pandas as pd
dataset = 'C:\\Users\\adity\\Downloads\\capstone\\Car Images\\'
X = [] # X will contain the image in the form of numpy N dimensional array
y = [] # y will contain the labels of the images
z = [] # z will tell if the image belongs to either the train or test dataset
h = [] # h will tell about the height of the image
w = [] # w will tell about the width of the image
c = [] # c will tell about the no of channels of the image
id = [] # id will be extracted from the image name to uniquely identify an image
path = [] # specifies the path of the image
count = 0
# list all folders inside train directory
for i in os.listdir(dataset):
print("Folder :-",i)
if i!= '.DS_Store': #Specific to Mac
for j in os.listdir(os.path.join(dataset, i)):
print("Sub Folder :-",j)
if j!= '.DS_Store': #Specific to Mac
for l in os.listdir(os.path.join(dataset, i, j)):
# read each image inside train directory one by one
k = os.path.splitext(l)
id.append(k[0])
dummy = cv2.imread(os.path.join(dataset, i, j, l))
# dummy = cv2.resize(dummy, (224, 224)) # Resizing can be any number - 128 is just kept for standard reference
X.append(dummy)
y.append(j)
z.append(i)
h.append(dummy.shape[0])
w.append(dummy.shape[1])
c.append(dummy.shape[2])
path.append(os.path.join(dataset, i, j, l))
count+=1
print("Count :-",count)
Folder :- .DS_Store Folder :- Test Images Sub Folder :- Acura Integra Type R 2001 Sub Folder :- Acura RL Sedan 2012 Sub Folder :- Acura TL Sedan 2012 Sub Folder :- Acura TL Type-S 2008 Sub Folder :- Acura TSX Sedan 2012 Sub Folder :- Acura ZDX Hatchback 2012 Sub Folder :- AM General Hummer SUV 2000 Sub Folder :- Aston Martin V8 Vantage Convertible 2012 Sub Folder :- Aston Martin V8 Vantage Coupe 2012 Sub Folder :- Aston Martin Virage Convertible 2012 Sub Folder :- Aston Martin Virage Coupe 2012 Sub Folder :- Audi 100 Sedan 1994 Sub Folder :- Audi 100 Wagon 1994 Sub Folder :- Audi A5 Coupe 2012 Sub Folder :- Audi R8 Coupe 2012 Sub Folder :- Audi RS 4 Convertible 2008 Sub Folder :- Audi S4 Sedan 2007 Sub Folder :- Audi S4 Sedan 2012 Sub Folder :- Audi S5 Convertible 2012 Sub Folder :- Audi S5 Coupe 2012 Sub Folder :- Audi S6 Sedan 2011 Sub Folder :- Audi TT Hatchback 2011 Sub Folder :- Audi TT RS Coupe 2012 Sub Folder :- Audi TTS Coupe 2012 Sub Folder :- Audi V8 Sedan 1994 Sub Folder :- Bentley Arnage Sedan 2009 Sub Folder :- Bentley Continental Flying Spur Sedan 2007 Sub Folder :- Bentley Continental GT Coupe 2007 Sub Folder :- Bentley Continental GT Coupe 2012 Sub Folder :- Bentley Continental Supersports Conv. Convertible 2012 Sub Folder :- Bentley Mulsanne Sedan 2011 Sub Folder :- BMW 1 Series Convertible 2012 Sub Folder :- BMW 1 Series Coupe 2012 Sub Folder :- BMW 3 Series Sedan 2012 Sub Folder :- BMW 3 Series Wagon 2012 Sub Folder :- BMW 6 Series Convertible 2007 Sub Folder :- BMW ActiveHybrid 5 Sedan 2012 Sub Folder :- BMW M3 Coupe 2012 Sub Folder :- BMW M5 Sedan 2010 Sub Folder :- BMW M6 Convertible 2010 Sub Folder :- BMW X3 SUV 2012 Sub Folder :- BMW X5 SUV 2007 Sub Folder :- BMW X6 SUV 2012 Sub Folder :- BMW Z4 Convertible 2012 Sub Folder :- Bugatti Veyron 16.4 Convertible 2009 Sub Folder :- Bugatti Veyron 16.4 Coupe 2009 Sub Folder :- Buick Enclave SUV 2012 Sub Folder :- Buick Rainier SUV 2007 Sub Folder :- Buick Regal GS 2012 Sub Folder :- Buick Verano Sedan 2012 Sub Folder :- Cadillac CTS-V Sedan 2012 Sub Folder :- Cadillac Escalade EXT Crew Cab 2007 Sub Folder :- Cadillac SRX SUV 2012 Sub Folder :- Chevrolet Avalanche Crew Cab 2012 Sub Folder :- Chevrolet Camaro Convertible 2012 Sub Folder :- Chevrolet Cobalt SS 2010 Sub Folder :- Chevrolet Corvette Convertible 2012 Sub Folder :- Chevrolet Corvette Ron Fellows Edition Z06 2007 Sub Folder :- Chevrolet Corvette ZR1 2012 Sub Folder :- Chevrolet Express Cargo Van 2007 Sub Folder :- Chevrolet Express Van 2007 Sub Folder :- Chevrolet HHR SS 2010 Sub Folder :- Chevrolet Impala Sedan 2007 Sub Folder :- Chevrolet Malibu Hybrid Sedan 2010 Sub Folder :- Chevrolet Malibu Sedan 2007 Sub Folder :- Chevrolet Monte Carlo Coupe 2007 Sub Folder :- Chevrolet Silverado 1500 Classic Extended Cab 2007 Sub Folder :- Chevrolet Silverado 1500 Extended Cab 2012 Sub Folder :- Chevrolet Silverado 1500 Hybrid Crew Cab 2012 Sub Folder :- Chevrolet Silverado 1500 Regular Cab 2012 Sub Folder :- Chevrolet Silverado 2500HD Regular Cab 2012 Sub Folder :- Chevrolet Sonic Sedan 2012 Sub Folder :- Chevrolet Tahoe Hybrid SUV 2012 Sub Folder :- Chevrolet TrailBlazer SS 2009 Sub Folder :- Chevrolet Traverse SUV 2012 Sub Folder :- Chrysler 300 SRT-8 2010 Sub Folder :- Chrysler Aspen SUV 2009 Sub Folder :- Chrysler Crossfire Convertible 2008 Sub Folder :- Chrysler PT Cruiser Convertible 2008 Sub Folder :- Chrysler Sebring Convertible 2010 Sub Folder :- Chrysler Town and Country Minivan 2012 Sub Folder :- Daewoo Nubira Wagon 2002 Sub Folder :- Dodge Caliber Wagon 2007 Sub Folder :- Dodge Caliber Wagon 2012 Sub Folder :- Dodge Caravan Minivan 1997 Sub Folder :- Dodge Challenger SRT8 2011 Sub Folder :- Dodge Charger Sedan 2012 Sub Folder :- Dodge Charger SRT-8 2009 Sub Folder :- Dodge Dakota Club Cab 2007 Sub Folder :- Dodge Dakota Crew Cab 2010 Sub Folder :- Dodge Durango SUV 2007 Sub Folder :- Dodge Durango SUV 2012 Sub Folder :- Dodge Journey SUV 2012 Sub Folder :- Dodge Magnum Wagon 2008 Sub Folder :- Dodge Ram Pickup 3500 Crew Cab 2010 Sub Folder :- Dodge Ram Pickup 3500 Quad Cab 2009 Sub Folder :- Dodge Sprinter Cargo Van 2009 Sub Folder :- Eagle Talon Hatchback 1998 Sub Folder :- Ferrari 458 Italia Convertible 2012 Sub Folder :- Ferrari 458 Italia Coupe 2012 Sub Folder :- Ferrari California Convertible 2012 Sub Folder :- Ferrari FF Coupe 2012 Sub Folder :- FIAT 500 Abarth 2012 Sub Folder :- FIAT 500 Convertible 2012 Sub Folder :- Fisker Karma Sedan 2012 Sub Folder :- Ford E-Series Wagon Van 2012 Sub Folder :- Ford Edge SUV 2012 Sub Folder :- Ford Expedition EL SUV 2009 Sub Folder :- Ford F-150 Regular Cab 2007 Sub Folder :- Ford F-150 Regular Cab 2012 Sub Folder :- Ford F-450 Super Duty Crew Cab 2012 Sub Folder :- Ford Fiesta Sedan 2012 Sub Folder :- Ford Focus Sedan 2007 Sub Folder :- Ford Freestar Minivan 2007 Sub Folder :- Ford GT Coupe 2006 Sub Folder :- Ford Mustang Convertible 2007 Sub Folder :- Ford Ranger SuperCab 2011 Sub Folder :- Geo Metro Convertible 1993 Sub Folder :- GMC Acadia SUV 2012 Sub Folder :- GMC Canyon Extended Cab 2012 Sub Folder :- GMC Savana Van 2012 Sub Folder :- GMC Terrain SUV 2012 Sub Folder :- GMC Yukon Hybrid SUV 2012 Sub Folder :- Honda Accord Coupe 2012 Sub Folder :- Honda Accord Sedan 2012 Sub Folder :- Honda Odyssey Minivan 2007 Sub Folder :- Honda Odyssey Minivan 2012 Sub Folder :- HUMMER H2 SUT Crew Cab 2009 Sub Folder :- HUMMER H3T Crew Cab 2010 Sub Folder :- Hyundai Accent Sedan 2012 Sub Folder :- Hyundai Azera Sedan 2012 Sub Folder :- Hyundai Elantra Sedan 2007 Sub Folder :- Hyundai Elantra Touring Hatchback 2012 Sub Folder :- Hyundai Genesis Sedan 2012 Sub Folder :- Hyundai Santa Fe SUV 2012 Sub Folder :- Hyundai Sonata Hybrid Sedan 2012 Sub Folder :- Hyundai Sonata Sedan 2012 Sub Folder :- Hyundai Tucson SUV 2012 Sub Folder :- Hyundai Veloster Hatchback 2012 Sub Folder :- Hyundai Veracruz SUV 2012 Sub Folder :- Infiniti G Coupe IPL 2012 Sub Folder :- Infiniti QX56 SUV 2011 Sub Folder :- Isuzu Ascender SUV 2008 Sub Folder :- Jaguar XK XKR 2012 Sub Folder :- Jeep Compass SUV 2012 Sub Folder :- Jeep Grand Cherokee SUV 2012 Sub Folder :- Jeep Liberty SUV 2012 Sub Folder :- Jeep Patriot SUV 2012 Sub Folder :- Jeep Wrangler SUV 2012 Sub Folder :- Lamborghini Aventador Coupe 2012 Sub Folder :- Lamborghini Diablo Coupe 2001 Sub Folder :- Lamborghini Gallardo LP 570-4 Superleggera 2012 Sub Folder :- Lamborghini Reventon Coupe 2008 Sub Folder :- Land Rover LR2 SUV 2012 Sub Folder :- Land Rover Range Rover SUV 2012 Sub Folder :- Lincoln Town Car Sedan 2011 Sub Folder :- Maybach Landaulet Convertible 2012 Sub Folder :- Mazda Tribute SUV 2011 Sub Folder :- McLaren MP4-12C Coupe 2012 Sub Folder :- Mercedes-Benz 300-Class Convertible 1993 Sub Folder :- Mercedes-Benz C-Class Sedan 2012 Sub Folder :- Mercedes-Benz E-Class Sedan 2012 Sub Folder :- Mercedes-Benz S-Class Sedan 2012 Sub Folder :- Mercedes-Benz SL-Class Coupe 2009 Sub Folder :- Mercedes-Benz Sprinter Van 2012 Sub Folder :- MINI Cooper Roadster Convertible 2012 Sub Folder :- Mitsubishi Lancer Sedan 2012 Sub Folder :- Nissan 240SX Coupe 1998 Sub Folder :- Nissan Juke Hatchback 2012 Sub Folder :- Nissan Leaf Hatchback 2012 Sub Folder :- Nissan NV Passenger Van 2012 Sub Folder :- Plymouth Neon Coupe 1999 Sub Folder :- Porsche Panamera Sedan 2012 Sub Folder :- Ram C-V Cargo Van Minivan 2012 Sub Folder :- Rolls-Royce Ghost Sedan 2012 Sub Folder :- Rolls-Royce Phantom Drophead Coupe Convertible 2012 Sub Folder :- Rolls-Royce Phantom Sedan 2012 Sub Folder :- Scion xD Hatchback 2012 Sub Folder :- smart fortwo Convertible 2012 Sub Folder :- Spyker C8 Convertible 2009 Sub Folder :- Spyker C8 Coupe 2009 Sub Folder :- Suzuki Aerio Sedan 2007 Sub Folder :- Suzuki Kizashi Sedan 2012 Sub Folder :- Suzuki SX4 Hatchback 2012 Sub Folder :- Suzuki SX4 Sedan 2012 Sub Folder :- Tesla Model S Sedan 2012 Sub Folder :- Toyota 4Runner SUV 2012 Sub Folder :- Toyota Camry Sedan 2012 Sub Folder :- Toyota Corolla Sedan 2012 Sub Folder :- Toyota Sequoia SUV 2012 Sub Folder :- Volkswagen Beetle Hatchback 2012 Sub Folder :- Volkswagen Golf Hatchback 1991 Sub Folder :- Volkswagen Golf Hatchback 2012 Sub Folder :- Volvo 240 Sedan 1993 Sub Folder :- Volvo C30 Hatchback 2012 Sub Folder :- Volvo XC90 SUV 2007 Folder :- Train Images Sub Folder :- .DS_Store Sub Folder :- Acura Integra Type R 2001 Sub Folder :- Acura RL Sedan 2012 Sub Folder :- Acura TL Sedan 2012 Sub Folder :- Acura TL Type-S 2008 Sub Folder :- Acura TSX Sedan 2012 Sub Folder :- Acura ZDX Hatchback 2012 Sub Folder :- AM General Hummer SUV 2000 Sub Folder :- Aston Martin V8 Vantage Convertible 2012 Sub Folder :- Aston Martin V8 Vantage Coupe 2012 Sub Folder :- Aston Martin Virage Convertible 2012 Sub Folder :- Aston Martin Virage Coupe 2012 Sub Folder :- Audi 100 Sedan 1994 Sub Folder :- Audi 100 Wagon 1994 Sub Folder :- Audi A5 Coupe 2012 Sub Folder :- Audi R8 Coupe 2012 Sub Folder :- Audi RS 4 Convertible 2008 Sub Folder :- Audi S4 Sedan 2007 Sub Folder :- Audi S4 Sedan 2012 Sub Folder :- Audi S5 Convertible 2012 Sub Folder :- Audi S5 Coupe 2012 Sub Folder :- Audi S6 Sedan 2011 Sub Folder :- Audi TT Hatchback 2011 Sub Folder :- Audi TT RS Coupe 2012 Sub Folder :- Audi TTS Coupe 2012 Sub Folder :- Audi V8 Sedan 1994 Sub Folder :- Bentley Arnage Sedan 2009 Sub Folder :- Bentley Continental Flying Spur Sedan 2007 Sub Folder :- Bentley Continental GT Coupe 2007 Sub Folder :- Bentley Continental GT Coupe 2012 Sub Folder :- Bentley Continental Supersports Conv. Convertible 2012 Sub Folder :- Bentley Mulsanne Sedan 2011 Sub Folder :- BMW 1 Series Convertible 2012 Sub Folder :- BMW 1 Series Coupe 2012 Sub Folder :- BMW 3 Series Sedan 2012 Sub Folder :- BMW 3 Series Wagon 2012 Sub Folder :- BMW 6 Series Convertible 2007 Sub Folder :- BMW ActiveHybrid 5 Sedan 2012 Sub Folder :- BMW M3 Coupe 2012 Sub Folder :- BMW M5 Sedan 2010 Sub Folder :- BMW M6 Convertible 2010 Sub Folder :- BMW X3 SUV 2012 Sub Folder :- BMW X5 SUV 2007 Sub Folder :- BMW X6 SUV 2012 Sub Folder :- BMW Z4 Convertible 2012 Sub Folder :- Bugatti Veyron 16.4 Convertible 2009 Sub Folder :- Bugatti Veyron 16.4 Coupe 2009 Sub Folder :- Buick Enclave SUV 2012 Sub Folder :- Buick Rainier SUV 2007 Sub Folder :- Buick Regal GS 2012 Sub Folder :- Buick Verano Sedan 2012 Sub Folder :- Cadillac CTS-V Sedan 2012 Sub Folder :- Cadillac Escalade EXT Crew Cab 2007 Sub Folder :- Cadillac SRX SUV 2012 Sub Folder :- Chevrolet Avalanche Crew Cab 2012 Sub Folder :- Chevrolet Camaro Convertible 2012 Sub Folder :- Chevrolet Cobalt SS 2010 Sub Folder :- Chevrolet Corvette Convertible 2012 Sub Folder :- Chevrolet Corvette Ron Fellows Edition Z06 2007 Sub Folder :- Chevrolet Corvette ZR1 2012 Sub Folder :- Chevrolet Express Cargo Van 2007 Sub Folder :- Chevrolet Express Van 2007 Sub Folder :- Chevrolet HHR SS 2010 Sub Folder :- Chevrolet Impala Sedan 2007 Sub Folder :- Chevrolet Malibu Hybrid Sedan 2010 Sub Folder :- Chevrolet Malibu Sedan 2007 Sub Folder :- Chevrolet Monte Carlo Coupe 2007 Sub Folder :- Chevrolet Silverado 1500 Classic Extended Cab 2007 Sub Folder :- Chevrolet Silverado 1500 Extended Cab 2012 Sub Folder :- Chevrolet Silverado 1500 Hybrid Crew Cab 2012 Sub Folder :- Chevrolet Silverado 1500 Regular Cab 2012 Sub Folder :- Chevrolet Silverado 2500HD Regular Cab 2012 Sub Folder :- Chevrolet Sonic Sedan 2012 Sub Folder :- Chevrolet Tahoe Hybrid SUV 2012 Sub Folder :- Chevrolet TrailBlazer SS 2009 Sub Folder :- Chevrolet Traverse SUV 2012 Sub Folder :- Chrysler 300 SRT-8 2010 Sub Folder :- Chrysler Aspen SUV 2009 Sub Folder :- Chrysler Crossfire Convertible 2008 Sub Folder :- Chrysler PT Cruiser Convertible 2008 Sub Folder :- Chrysler Sebring Convertible 2010 Sub Folder :- Chrysler Town and Country Minivan 2012 Sub Folder :- Daewoo Nubira Wagon 2002 Sub Folder :- Dodge Caliber Wagon 2007 Sub Folder :- Dodge Caliber Wagon 2012 Sub Folder :- Dodge Caravan Minivan 1997 Sub Folder :- Dodge Challenger SRT8 2011 Sub Folder :- Dodge Charger Sedan 2012 Sub Folder :- Dodge Charger SRT-8 2009 Sub Folder :- Dodge Dakota Club Cab 2007 Sub Folder :- Dodge Dakota Crew Cab 2010 Sub Folder :- Dodge Durango SUV 2007 Sub Folder :- Dodge Durango SUV 2012 Sub Folder :- Dodge Journey SUV 2012 Sub Folder :- Dodge Magnum Wagon 2008 Sub Folder :- Dodge Ram Pickup 3500 Crew Cab 2010 Sub Folder :- Dodge Ram Pickup 3500 Quad Cab 2009 Sub Folder :- Dodge Sprinter Cargo Van 2009 Sub Folder :- Eagle Talon Hatchback 1998 Sub Folder :- Ferrari 458 Italia Convertible 2012 Sub Folder :- Ferrari 458 Italia Coupe 2012 Sub Folder :- Ferrari California Convertible 2012 Sub Folder :- Ferrari FF Coupe 2012 Sub Folder :- FIAT 500 Abarth 2012 Sub Folder :- FIAT 500 Convertible 2012 Sub Folder :- Fisker Karma Sedan 2012 Sub Folder :- Ford E-Series Wagon Van 2012 Sub Folder :- Ford Edge SUV 2012 Sub Folder :- Ford Expedition EL SUV 2009 Sub Folder :- Ford F-150 Regular Cab 2007 Sub Folder :- Ford F-150 Regular Cab 2012 Sub Folder :- Ford F-450 Super Duty Crew Cab 2012 Sub Folder :- Ford Fiesta Sedan 2012 Sub Folder :- Ford Focus Sedan 2007 Sub Folder :- Ford Freestar Minivan 2007 Sub Folder :- Ford GT Coupe 2006 Sub Folder :- Ford Mustang Convertible 2007 Sub Folder :- Ford Ranger SuperCab 2011 Sub Folder :- Geo Metro Convertible 1993 Sub Folder :- GMC Acadia SUV 2012 Sub Folder :- GMC Canyon Extended Cab 2012 Sub Folder :- GMC Savana Van 2012 Sub Folder :- GMC Terrain SUV 2012 Sub Folder :- GMC Yukon Hybrid SUV 2012 Sub Folder :- Honda Accord Coupe 2012 Sub Folder :- Honda Accord Sedan 2012 Sub Folder :- Honda Odyssey Minivan 2007 Sub Folder :- Honda Odyssey Minivan 2012 Sub Folder :- HUMMER H2 SUT Crew Cab 2009 Sub Folder :- HUMMER H3T Crew Cab 2010 Sub Folder :- Hyundai Accent Sedan 2012 Sub Folder :- Hyundai Azera Sedan 2012 Sub Folder :- Hyundai Elantra Sedan 2007 Sub Folder :- Hyundai Elantra Touring Hatchback 2012 Sub Folder :- Hyundai Genesis Sedan 2012 Sub Folder :- Hyundai Santa Fe SUV 2012 Sub Folder :- Hyundai Sonata Hybrid Sedan 2012 Sub Folder :- Hyundai Sonata Sedan 2012 Sub Folder :- Hyundai Tucson SUV 2012 Sub Folder :- Hyundai Veloster Hatchback 2012 Sub Folder :- Hyundai Veracruz SUV 2012 Sub Folder :- Infiniti G Coupe IPL 2012 Sub Folder :- Infiniti QX56 SUV 2011 Sub Folder :- Isuzu Ascender SUV 2008 Sub Folder :- Jaguar XK XKR 2012 Sub Folder :- Jeep Compass SUV 2012 Sub Folder :- Jeep Grand Cherokee SUV 2012 Sub Folder :- Jeep Liberty SUV 2012 Sub Folder :- Jeep Patriot SUV 2012 Sub Folder :- Jeep Wrangler SUV 2012 Sub Folder :- Lamborghini Aventador Coupe 2012 Sub Folder :- Lamborghini Diablo Coupe 2001 Sub Folder :- Lamborghini Gallardo LP 570-4 Superleggera 2012 Sub Folder :- Lamborghini Reventon Coupe 2008 Sub Folder :- Land Rover LR2 SUV 2012 Sub Folder :- Land Rover Range Rover SUV 2012 Sub Folder :- Lincoln Town Car Sedan 2011 Sub Folder :- Maybach Landaulet Convertible 2012 Sub Folder :- Mazda Tribute SUV 2011 Sub Folder :- McLaren MP4-12C Coupe 2012 Sub Folder :- Mercedes-Benz 300-Class Convertible 1993 Sub Folder :- Mercedes-Benz C-Class Sedan 2012 Sub Folder :- Mercedes-Benz E-Class Sedan 2012 Sub Folder :- Mercedes-Benz S-Class Sedan 2012 Sub Folder :- Mercedes-Benz SL-Class Coupe 2009 Sub Folder :- Mercedes-Benz Sprinter Van 2012 Sub Folder :- MINI Cooper Roadster Convertible 2012 Sub Folder :- Mitsubishi Lancer Sedan 2012 Sub Folder :- Nissan 240SX Coupe 1998 Sub Folder :- Nissan Juke Hatchback 2012 Sub Folder :- Nissan Leaf Hatchback 2012 Sub Folder :- Nissan NV Passenger Van 2012 Sub Folder :- Plymouth Neon Coupe 1999 Sub Folder :- Porsche Panamera Sedan 2012 Sub Folder :- Ram C-V Cargo Van Minivan 2012 Sub Folder :- Rolls-Royce Ghost Sedan 2012 Sub Folder :- Rolls-Royce Phantom Drophead Coupe Convertible 2012 Sub Folder :- Rolls-Royce Phantom Sedan 2012 Sub Folder :- Scion xD Hatchback 2012 Sub Folder :- smart fortwo Convertible 2012 Sub Folder :- Spyker C8 Convertible 2009 Sub Folder :- Spyker C8 Coupe 2009 Sub Folder :- Suzuki Aerio Sedan 2007 Sub Folder :- Suzuki Kizashi Sedan 2012 Sub Folder :- Suzuki SX4 Hatchback 2012 Sub Folder :- Suzuki SX4 Sedan 2012 Sub Folder :- Tesla Model S Sedan 2012 Sub Folder :- Toyota 4Runner SUV 2012 Sub Folder :- Toyota Camry Sedan 2012 Sub Folder :- Toyota Corolla Sedan 2012 Sub Folder :- Toyota Sequoia SUV 2012 Sub Folder :- Volkswagen Beetle Hatchback 2012 Sub Folder :- Volkswagen Golf Hatchback 1991 Sub Folder :- Volkswagen Golf Hatchback 2012 Sub Folder :- Volvo 240 Sedan 1993 Sub Folder :- Volvo C30 Hatchback 2012 Sub Folder :- Volvo XC90 SUV 2007 Count :- 16185
# Creating a Dataframe consisting of information about the images
data_df = pd.DataFrame({'id':id, 'image':X, 'label':y, 'dataset':z, 'height':h, 'width':w, 'n_channels' :c, 'path' :path})
data_df.head()
| id | image | label | dataset | height | width | n_channels | path | |
|---|---|---|---|---|---|---|---|---|
| 0 | 00128 | [[[110, 143, 122], [91, 124, 103], [79, 115, 9... | Acura Integra Type R 2001 | Test Images | 600 | 900 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... |
| 1 | 00130 | [[[96, 94, 94], [97, 95, 95], [99, 97, 97], [9... | Acura Integra Type R 2001 | Test Images | 458 | 800 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... |
| 2 | 00386 | [[[0, 0, 0], [0, 0, 0], [4, 4, 4], [0, 0, 0], ... | Acura Integra Type R 2001 | Test Images | 533 | 800 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... |
| 3 | 00565 | [[[253, 253, 253], [253, 253, 253], [253, 253,... | Acura Integra Type R 2001 | Test Images | 380 | 545 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... |
| 4 | 00711 | [[[95, 79, 72], [95, 79, 72], [95, 79, 72], [9... | Acura Integra Type R 2001 | Test Images | 409 | 799 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... |
# Creating test and train datasets
test_df = data_df[data_df['dataset']=='Test Images']
train_df = data_df[data_df['dataset']=='Train Images']
Step 4: Basic Data Analysis¶
print("No of classes in Train :-",len(next(os.walk('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Train Images\\'))[1]))
print("No of classes in Test :-",len(next(os.walk('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\'))[1]))
No of classes in Train :- 196 No of classes in Test :- 196
train_df['label'].nunique()
196
test_df['label'].nunique()
196
train_df['label'].unique() == test_df['label'].unique()
array([ True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True, True, True,
True, True, True, True, True, True, True])
- Observations:-
-> The above statement checks if the unique values in the training match with the unique values in the test
# Extracting the unique label names of the cars from the train dataset
unique_labels = list(train_df['label'].unique())
unique_labels
['Acura Integra Type R 2001', 'Acura RL Sedan 2012', 'Acura TL Sedan 2012', 'Acura TL Type-S 2008', 'Acura TSX Sedan 2012', 'Acura ZDX Hatchback 2012', 'AM General Hummer SUV 2000', 'Aston Martin V8 Vantage Convertible 2012', 'Aston Martin V8 Vantage Coupe 2012', 'Aston Martin Virage Convertible 2012', 'Aston Martin Virage Coupe 2012', 'Audi 100 Sedan 1994', 'Audi 100 Wagon 1994', 'Audi A5 Coupe 2012', 'Audi R8 Coupe 2012', 'Audi RS 4 Convertible 2008', 'Audi S4 Sedan 2007', 'Audi S4 Sedan 2012', 'Audi S5 Convertible 2012', 'Audi S5 Coupe 2012', 'Audi S6 Sedan 2011', 'Audi TT Hatchback 2011', 'Audi TT RS Coupe 2012', 'Audi TTS Coupe 2012', 'Audi V8 Sedan 1994', 'Bentley Arnage Sedan 2009', 'Bentley Continental Flying Spur Sedan 2007', 'Bentley Continental GT Coupe 2007', 'Bentley Continental GT Coupe 2012', 'Bentley Continental Supersports Conv. Convertible 2012', 'Bentley Mulsanne Sedan 2011', 'BMW 1 Series Convertible 2012', 'BMW 1 Series Coupe 2012', 'BMW 3 Series Sedan 2012', 'BMW 3 Series Wagon 2012', 'BMW 6 Series Convertible 2007', 'BMW ActiveHybrid 5 Sedan 2012', 'BMW M3 Coupe 2012', 'BMW M5 Sedan 2010', 'BMW M6 Convertible 2010', 'BMW X3 SUV 2012', 'BMW X5 SUV 2007', 'BMW X6 SUV 2012', 'BMW Z4 Convertible 2012', 'Bugatti Veyron 16.4 Convertible 2009', 'Bugatti Veyron 16.4 Coupe 2009', 'Buick Enclave SUV 2012', 'Buick Rainier SUV 2007', 'Buick Regal GS 2012', 'Buick Verano Sedan 2012', 'Cadillac CTS-V Sedan 2012', 'Cadillac Escalade EXT Crew Cab 2007', 'Cadillac SRX SUV 2012', 'Chevrolet Avalanche Crew Cab 2012', 'Chevrolet Camaro Convertible 2012', 'Chevrolet Cobalt SS 2010', 'Chevrolet Corvette Convertible 2012', 'Chevrolet Corvette Ron Fellows Edition Z06 2007', 'Chevrolet Corvette ZR1 2012', 'Chevrolet Express Cargo Van 2007', 'Chevrolet Express Van 2007', 'Chevrolet HHR SS 2010', 'Chevrolet Impala Sedan 2007', 'Chevrolet Malibu Hybrid Sedan 2010', 'Chevrolet Malibu Sedan 2007', 'Chevrolet Monte Carlo Coupe 2007', 'Chevrolet Silverado 1500 Classic Extended Cab 2007', 'Chevrolet Silverado 1500 Extended Cab 2012', 'Chevrolet Silverado 1500 Hybrid Crew Cab 2012', 'Chevrolet Silverado 1500 Regular Cab 2012', 'Chevrolet Silverado 2500HD Regular Cab 2012', 'Chevrolet Sonic Sedan 2012', 'Chevrolet Tahoe Hybrid SUV 2012', 'Chevrolet TrailBlazer SS 2009', 'Chevrolet Traverse SUV 2012', 'Chrysler 300 SRT-8 2010', 'Chrysler Aspen SUV 2009', 'Chrysler Crossfire Convertible 2008', 'Chrysler PT Cruiser Convertible 2008', 'Chrysler Sebring Convertible 2010', 'Chrysler Town and Country Minivan 2012', 'Daewoo Nubira Wagon 2002', 'Dodge Caliber Wagon 2007', 'Dodge Caliber Wagon 2012', 'Dodge Caravan Minivan 1997', 'Dodge Challenger SRT8 2011', 'Dodge Charger Sedan 2012', 'Dodge Charger SRT-8 2009', 'Dodge Dakota Club Cab 2007', 'Dodge Dakota Crew Cab 2010', 'Dodge Durango SUV 2007', 'Dodge Durango SUV 2012', 'Dodge Journey SUV 2012', 'Dodge Magnum Wagon 2008', 'Dodge Ram Pickup 3500 Crew Cab 2010', 'Dodge Ram Pickup 3500 Quad Cab 2009', 'Dodge Sprinter Cargo Van 2009', 'Eagle Talon Hatchback 1998', 'Ferrari 458 Italia Convertible 2012', 'Ferrari 458 Italia Coupe 2012', 'Ferrari California Convertible 2012', 'Ferrari FF Coupe 2012', 'FIAT 500 Abarth 2012', 'FIAT 500 Convertible 2012', 'Fisker Karma Sedan 2012', 'Ford E-Series Wagon Van 2012', 'Ford Edge SUV 2012', 'Ford Expedition EL SUV 2009', 'Ford F-150 Regular Cab 2007', 'Ford F-150 Regular Cab 2012', 'Ford F-450 Super Duty Crew Cab 2012', 'Ford Fiesta Sedan 2012', 'Ford Focus Sedan 2007', 'Ford Freestar Minivan 2007', 'Ford GT Coupe 2006', 'Ford Mustang Convertible 2007', 'Ford Ranger SuperCab 2011', 'Geo Metro Convertible 1993', 'GMC Acadia SUV 2012', 'GMC Canyon Extended Cab 2012', 'GMC Savana Van 2012', 'GMC Terrain SUV 2012', 'GMC Yukon Hybrid SUV 2012', 'Honda Accord Coupe 2012', 'Honda Accord Sedan 2012', 'Honda Odyssey Minivan 2007', 'Honda Odyssey Minivan 2012', 'HUMMER H2 SUT Crew Cab 2009', 'HUMMER H3T Crew Cab 2010', 'Hyundai Accent Sedan 2012', 'Hyundai Azera Sedan 2012', 'Hyundai Elantra Sedan 2007', 'Hyundai Elantra Touring Hatchback 2012', 'Hyundai Genesis Sedan 2012', 'Hyundai Santa Fe SUV 2012', 'Hyundai Sonata Hybrid Sedan 2012', 'Hyundai Sonata Sedan 2012', 'Hyundai Tucson SUV 2012', 'Hyundai Veloster Hatchback 2012', 'Hyundai Veracruz SUV 2012', 'Infiniti G Coupe IPL 2012', 'Infiniti QX56 SUV 2011', 'Isuzu Ascender SUV 2008', 'Jaguar XK XKR 2012', 'Jeep Compass SUV 2012', 'Jeep Grand Cherokee SUV 2012', 'Jeep Liberty SUV 2012', 'Jeep Patriot SUV 2012', 'Jeep Wrangler SUV 2012', 'Lamborghini Aventador Coupe 2012', 'Lamborghini Diablo Coupe 2001', 'Lamborghini Gallardo LP 570-4 Superleggera 2012', 'Lamborghini Reventon Coupe 2008', 'Land Rover LR2 SUV 2012', 'Land Rover Range Rover SUV 2012', 'Lincoln Town Car Sedan 2011', 'Maybach Landaulet Convertible 2012', 'Mazda Tribute SUV 2011', 'McLaren MP4-12C Coupe 2012', 'Mercedes-Benz 300-Class Convertible 1993', 'Mercedes-Benz C-Class Sedan 2012', 'Mercedes-Benz E-Class Sedan 2012', 'Mercedes-Benz S-Class Sedan 2012', 'Mercedes-Benz SL-Class Coupe 2009', 'Mercedes-Benz Sprinter Van 2012', 'MINI Cooper Roadster Convertible 2012', 'Mitsubishi Lancer Sedan 2012', 'Nissan 240SX Coupe 1998', 'Nissan Juke Hatchback 2012', 'Nissan Leaf Hatchback 2012', 'Nissan NV Passenger Van 2012', 'Plymouth Neon Coupe 1999', 'Porsche Panamera Sedan 2012', 'Ram C-V Cargo Van Minivan 2012', 'Rolls-Royce Ghost Sedan 2012', 'Rolls-Royce Phantom Drophead Coupe Convertible 2012', 'Rolls-Royce Phantom Sedan 2012', 'Scion xD Hatchback 2012', 'smart fortwo Convertible 2012', 'Spyker C8 Convertible 2009', 'Spyker C8 Coupe 2009', 'Suzuki Aerio Sedan 2007', 'Suzuki Kizashi Sedan 2012', 'Suzuki SX4 Hatchback 2012', 'Suzuki SX4 Sedan 2012', 'Tesla Model S Sedan 2012', 'Toyota 4Runner SUV 2012', 'Toyota Camry Sedan 2012', 'Toyota Corolla Sedan 2012', 'Toyota Sequoia SUV 2012', 'Volkswagen Beetle Hatchback 2012', 'Volkswagen Golf Hatchback 1991', 'Volkswagen Golf Hatchback 2012', 'Volvo 240 Sedan 1993', 'Volvo C30 Hatchback 2012', 'Volvo XC90 SUV 2007']
# Extracting car name from the car_names.csv
car_names_csv = pd.read_csv("Car+names+and+make.csv", header=None, names=['tags'])
car_names_csv.head()
| tags | |
|---|---|
| 0 | AM General Hummer SUV 2000 |
| 1 | Acura RL Sedan 2012 |
| 2 | Acura TL Sedan 2012 |
| 3 | Acura TL Type-S 2008 |
| 4 | Acura TSX Sedan 2012 |
# Checking if the unique labels of cars csv match with the unique values from sub folder names inside train folder
set(car_names_csv['tags']) == set(train_df['label'])
False
# Checking if the unique labels of cars csv match with the unique values from sub folder names inside test folder
set(car_names_csv['tags']) == set(test_df['label'])
False
- Observations:-
-> There seems to be some disrepancy between the label names provided in csv vs label names from sub folder
set(train_df['label']) == set(test_df['label'])
True
# Checking the discrepency between car names csv vs train labels
set(car_names_csv['tags']) ^ set(train_df['label'])
{'Ram C-V Cargo Van Minivan 2012', 'Ram C/V Cargo Van Minivan 2012'}
- Observations:-
-> The discrepeny seems to be negligible as the name of the car is seems to be same
car_names_csv[car_names_csv['tags']=='Ram C-V Cargo Van Minivan 2012']
| tags |
|---|
train_df[train_df['label']=='Ram C/V Cargo Van Minivan 2012']
| id | image | label | dataset | height | width | n_channels | path |
|---|
# Specifying the annotation csv path
annot_path = "C:\\Users\\adity\\Downloads\\capstone\\Annotations\\"
# Reading the test annotation csv
test_annot_df = pd.read_csv(annot_path+"Test Annotation.csv")
test_annot_df.head()
| Image Name | Bounding Box coordinates | Unnamed: 2 | Unnamed: 3 | Unnamed: 4 | Image class | |
|---|---|---|---|---|---|---|
| 0 | 00001.jpg | 30 | 52 | 246 | 147 | 181 |
| 1 | 00002.jpg | 100 | 19 | 576 | 203 | 103 |
| 2 | 00003.jpg | 51 | 105 | 968 | 659 | 145 |
| 3 | 00004.jpg | 67 | 84 | 581 | 407 | 187 |
| 4 | 00005.jpg | 140 | 151 | 593 | 339 | 185 |
# Checking the number of class labels in the test annotation csv
test_annot_df['Image class'].nunique()
196
# Reading the train annotation csv
train_annot_df = pd.read_csv(annot_path+"Train Annotations.csv")
train_annot_df.head()
| Image Name | Bounding Box coordinates | Unnamed: 2 | Unnamed: 3 | Unnamed: 4 | Image class | |
|---|---|---|---|---|---|---|
| 0 | 00001.jpg | 39 | 116 | 569 | 375 | 14 |
| 1 | 00002.jpg | 36 | 116 | 868 | 587 | 3 |
| 2 | 00003.jpg | 85 | 109 | 601 | 381 | 91 |
| 3 | 00004.jpg | 621 | 393 | 1484 | 1096 | 134 |
| 4 | 00005.jpg | 14 | 36 | 133 | 99 | 106 |
# Checking the number of class labels in the train annotation csv
train_annot_df['Image class'].nunique()
196
# Appending the test and the train annotation dfs
annot_df = pd.concat([train_annot_df.assign(source='train'), test_annot_df.assign(source='test')], ignore_index=True)
annot_df.head()
| Image Name | Bounding Box coordinates | Unnamed: 2 | Unnamed: 3 | Unnamed: 4 | Image class | source | |
|---|---|---|---|---|---|---|---|
| 0 | 00001.jpg | 39 | 116 | 569 | 375 | 14 | train |
| 1 | 00002.jpg | 36 | 116 | 868 | 587 | 3 | train |
| 2 | 00003.jpg | 85 | 109 | 601 | 381 | 91 | train |
| 3 | 00004.jpg | 621 | 393 | 1484 | 1096 | 134 | train |
| 4 | 00005.jpg | 14 | 36 | 133 | 99 | 106 | train |
# Checking the number of class labels in the combined annotation df
annot_df['Image class'].nunique()
196
# Dropping the extension from the name of the image
annot_df['Image Name'] = annot_df['Image Name'].apply(lambda x : x[:-4])
annot_df.head()
| Image Name | Bounding Box coordinates | Unnamed: 2 | Unnamed: 3 | Unnamed: 4 | Image class | source | |
|---|---|---|---|---|---|---|---|
| 0 | 00001 | 39 | 116 | 569 | 375 | 14 | train |
| 1 | 00002 | 36 | 116 | 868 | 587 | 3 | train |
| 2 | 00003 | 85 | 109 | 601 | 381 | 91 | train |
| 3 | 00004 | 621 | 393 | 1484 | 1096 | 134 | train |
| 4 | 00005 | 14 | 36 | 133 | 99 | 106 | train |
# Renaming the attributes in the annotation dataset
annot_df.rename(columns={'Bounding Box coordinates':'x_min', 'Unnamed: 2':'y_min','Unnamed: 3':'x_max','Unnamed: 4':'y_max','Image Name':'id'}, inplace=True)
annot_df.head()
| id | x_min | y_min | x_max | y_max | Image class | source | |
|---|---|---|---|---|---|---|---|
| 0 | 00001 | 39 | 116 | 569 | 375 | 14 | train |
| 1 | 00002 | 36 | 116 | 868 | 587 | 3 | train |
| 2 | 00003 | 85 | 109 | 601 | 381 | 91 | train |
| 3 | 00004 | 621 | 393 | 1484 | 1096 | 134 | train |
| 4 | 00005 | 14 | 36 | 133 | 99 | 106 | train |
# Creating a new attribute names source in the image df which would be used for merging later
data_df['source'] = data_df['dataset'].apply(lambda x : 'test' if x == 'Test Images' else 'train')
data_df.head()
| id | image | label | dataset | height | width | n_channels | path | source | |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 00128 | [[[110, 143, 122], [91, 124, 103], [79, 115, 9... | Acura Integra Type R 2001 | Test Images | 600 | 900 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test |
| 1 | 00130 | [[[96, 94, 94], [97, 95, 95], [99, 97, 97], [9... | Acura Integra Type R 2001 | Test Images | 458 | 800 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test |
| 2 | 00386 | [[[0, 0, 0], [0, 0, 0], [4, 4, 4], [0, 0, 0], ... | Acura Integra Type R 2001 | Test Images | 533 | 800 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test |
| 3 | 00565 | [[[253, 253, 253], [253, 253, 253], [253, 253,... | Acura Integra Type R 2001 | Test Images | 380 | 545 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test |
| 4 | 00711 | [[[95, 79, 72], [95, 79, 72], [95, 79, 72], [9... | Acura Integra Type R 2001 | Test Images | 409 | 799 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test |
annot_df.count()
id 16185 x_min 16185 y_min 16185 x_max 16185 y_max 16185 Image class 16185 source 16185 dtype: int64
data_df.count()
id 16185 image 16185 label 16185 dataset 16185 height 16185 width 16185 n_channels 16185 path 16185 source 16185 dtype: int64
# Merging the annotation df and the image df to have a df having the consoliated data
master_df = pd.merge(data_df,annot_df, on=['id','source'], how='outer')
master_df.head()
| id | image | label | dataset | height | width | n_channels | path | source | x_min | y_min | x_max | y_max | Image class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 00001 | [[[254, 254, 254], [254, 254, 254], [254, 254,... | Suzuki Aerio Sedan 2007 | Test Images | 182 | 276 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test | 30 | 52 | 246 | 147 | 181 |
| 1 | 00001 | [[[123, 119, 101], [120, 116, 98], [115, 111, ... | Audi TTS Coupe 2012 | Train Images | 400 | 600 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | train | 39 | 116 | 569 | 375 | 14 |
| 2 | 00002 | [[[119, 122, 126], [117, 120, 124], [114, 117,... | Ferrari 458 Italia Convertible 2012 | Test Images | 360 | 640 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test | 100 | 19 | 576 | 203 | 103 |
| 3 | 00002 | [[[175, 169, 164], [177, 171, 166], [180, 174,... | Acura TL Sedan 2012 | Train Images | 675 | 900 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | train | 36 | 116 | 868 | 587 | 3 |
| 4 | 00003 | [[[166, 160, 161], [164, 158, 159], [162, 156,... | Jeep Patriot SUV 2012 | Test Images | 741 | 1024 | 3 | C:\Users\adity\Downloads\capstone\Car Images\T... | test | 51 | 105 | 968 | 659 | 145 |
master_df.count()
id 16185 image 16185 label 16185 dataset 16185 height 16185 width 16185 n_channels 16185 path 16185 source 16185 x_min 16185 y_min 16185 x_max 16185 y_max 16185 Image class 16185 dtype: int64
# Extracting the year of the car model from the label
master_df['model_year'] = master_df['label'].apply(lambda x: x[-4:])
def value_counts_df(df, col):
df = pd.DataFrame(df[col].value_counts())
df.index.name = col
df.columns = ['count']
return df
countcars_train = value_counts_df(master_df[master_df['source']=='train'],'label')
countcars_train.head(5)
| count | |
|---|---|
| label | |
| GMC Savana Van 2012 | 68 |
| Chrysler 300 SRT-8 2010 | 49 |
| Mercedes-Benz 300-Class Convertible 1993 | 48 |
| Mitsubishi Lancer Sedan 2012 | 48 |
| Jaguar XK XKR 2012 | 47 |
countcars_train.tail(5)
| count | |
|---|---|
| label | |
| Rolls-Royce Phantom Drophead Coupe Convertible 2012 | 31 |
| Chevrolet Express Cargo Van 2007 | 30 |
| Maybach Landaulet Convertible 2012 | 29 |
| FIAT 500 Abarth 2012 | 28 |
| Hyundai Accent Sedan 2012 | 24 |
import matplotlib.pyplot as plt
import seaborn as sns
plt.figure(figsize=(12, 6))
sns.countplot(x='model_year', data=master_df)
plt.title('Distribution of Samples Over Years')
plt.show()
countcars_train
| count | |
|---|---|
| label | |
| GMC Savana Van 2012 | 68 |
| Chrysler 300 SRT-8 2010 | 49 |
| Mercedes-Benz 300-Class Convertible 1993 | 48 |
| Mitsubishi Lancer Sedan 2012 | 48 |
| Jaguar XK XKR 2012 | 47 |
| ... | ... |
| Rolls-Royce Phantom Drophead Coupe Convertible 2012 | 31 |
| Chevrolet Express Cargo Van 2007 | 30 |
| Maybach Landaulet Convertible 2012 | 29 |
| FIAT 500 Abarth 2012 | 28 |
| Hyundai Accent Sedan 2012 | 24 |
196 rows × 1 columns
import matplotlib.pyplot as plt
import seaborn as sns
plt.figure(figsize=(12, 6))
sns.countplot(x='model_year', data=master_df)
plt.title('Distribution of Samples Over Years')
plt.show()
Insights:
The visualization shows the distribution of samples (images of cars) across different years.
You can observe if there are any imbalances in the dataset concerning the years of the car models. This could be useful for understanding if the dataset covers a diverse range of car models from different years or if there's a bias towards certain years.
From the above graph we can observe that we have more than 8000 plus data
import matplotlib.pyplot as plt
import seaborn as sns
plt.figure(figsize=(12, 6))
sns.countplot(x='model_year', data=master_df)
plt.title('Distribution of Samples Over Years')
plt.show()
**Insights:**
* The visualization shows the distribution of samples (images of cars) across different years.
* You can observe if there are any imbalances in the dataset concerning the years of the car models. This could be useful for understanding if the dataset covers a diverse range of car models from different years or if there's a bias towards certain years.
* From the above graph we can observe that we have more than 8000 plus data
collected which consist of both train and test for car models year 2012 followed by year 2007,2009,2010,2011 & 2008.
import matplotlib.pyplot as plt
fig = plt.figure(figsize=(10,5))
ax = fig.add_axes([0,0,1,1])
carNames = countcars_train.iloc[:10,:].index
carCounts = countcars_train.iloc[:10,:]['count']
ax.bar(carNames,carCounts)
plt.xticks(rotation=90)
plt.show()
Insights:
- This bar plot displays the counts of the top 10 car models present in the dataset.
- You can identify the most prevalent car models in the dataset. This information could be useful for understanding which car models are well-represented and which ones are less common. It also gives an idea of the dataset's diversity.
import matplotlib.pyplot as plt
master_df['source'].value_counts().plot.pie(autopct='%1.1f%%', startangle=90)
plt.title('Distribution of Source in Test and Train')
plt.show()
Insights:
This pie chart visualizes the distribution of samples between the training and testing datasets
You can see the proportion of data allocated for training and testing. It's important to have a balanced distribution between training and testing sets to ensure the model's generalization performance.
From the above graph we can see that 50% of data is train data and train images. Where as on the other hand we have only 49% of test data and test images.
# Step 4: Display images with bounding box.[ 5 points ]
def bounding_box(ser):
x0 = ser['x_min']
x1 = ser['x_max']
y0 = ser['y_min']
y1 = ser['y_max']
#Resize the image and draw a rectangle as per bounding box information
#Convert BGR format (used by opencv to RGB format used by matplotlib)
img = cv2.imread(ser['path'])
img = cv2.resize(img,(224, 224))
w = ser['width']
h = ser['height']
x_ratio = 224/w
y_ratio = 224/h
start_point = (int(x0*x_ratio), int(y0*y_ratio))
# start_point = (int(x0), int(y0))
end_point = (int(x1*x_ratio), int(y1*y_ratio))
# end_point = (int(x1), int(y1))
cv2.rectangle(img, start_point, end_point, color=(0,255,0), thickness=2)
# cv2.imwrite("{}.jpg".format(ser['id']), img)
# cv2.imshow("image_{}".format(ser['id']), img)
# cv2.waitKey()
# cv2.destroyAllWindows()
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
#Draw image using matplotlib
plt.suptitle(ser['label'])
plt.imshow(img)
plt.show()
# for i in range(master_df.shape[0]):
from matplotlib import pyplot as plt
for i in range(20):
bounding_box(master_df.iloc[i, :])
Insights:
- This visualization displays a subset of images along with their bounding boxes, indicating the regions of interest (cars) in the images.
- This visualization helps in understanding the effectiveness of the bounding box annotations. It also provides a visual check to ensure that the bounding boxes align with the objects of interest (cars) in the images.
master_df.shape
(16185, 15)
master_df.count()
id 16185 image 16185 label 16185 dataset 16185 height 16185 width 16185 n_channels 16185 path 16185 source 16185 x_min 16185 y_min 16185 x_max 16185 y_max 16185 Image class 16185 model_year 16185 dtype: int64
master_df.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 16185 entries, 0 to 16184 Data columns (total 15 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 id 16185 non-null object 1 image 16185 non-null object 2 label 16185 non-null object 3 dataset 16185 non-null object 4 height 16185 non-null int64 5 width 16185 non-null int64 6 n_channels 16185 non-null int64 7 path 16185 non-null object 8 source 16185 non-null object 9 x_min 16185 non-null int64 10 y_min 16185 non-null int64 11 x_max 16185 non-null int64 12 y_max 16185 non-null int64 13 Image class 16185 non-null int64 14 model_year 16185 non-null object dtypes: int64(8), object(7) memory usage: 1.9+ MB
master_df['n_channels'].value_counts()
n_channels 3 16185 Name: count, dtype: int64
type(y)
list
Model Building¶
# Step 5: Design, train and test basic CNN models to classify the car. [ 10 points ]
# Importing Libraries for Model Building & evaluating performance
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import *
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization, Flatten, MaxPooling2D, Conv2D, Convolution2D, Activation
from tensorflow.keras.optimizers import Adam
from tensorflow.keras import optimizers
import scipy
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau
# Image augmentation for CNN model
# Create data generator for training data with data augmentation and normalizing all
# values by 255
train_datagen = ImageDataGenerator(rescale = 1./255,
rotation_range=20,
shear_range = 0.2,
zoom_range = 0.2,
horizontal_flip = True)
test_datagen = ImageDataGenerator(rescale = 1./255)
# Setting training data generator's source directory
# Setting the target size to resize all the images to (64,64) as the model input layer expects 32X32 images
training_set = train_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Train Images\\',
target_size = (224, 224),
batch_size = 16,
class_mode = 'categorical')
# Setting testing data generator's source directory
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 16,
class_mode = 'categorical')
Found 8144 images belonging to 196 classes. Found 8041 images belonging to 196 classes.
# Model 1
# Initialising the CNN classifier
model_0 = Sequential()
# Add a Convolution layer with 32 kernels of 3X3 shape with activation function ReLU
model_0.add(Conv2D(32, (3, 3), input_shape = (224, 224, 3), activation = 'relu', padding = 'same'))
# Add a Max Pooling layer of size 2X2
model_0.add(MaxPooling2D(pool_size = (2, 2)))
# Add another Convolution layer with 32 kernels of 3X3 shape with activation function ReLU
model_0.add(Conv2D(64, (3, 3), activation = 'relu', padding = 'same'))
# Adding another pooling layer
model_0.add(MaxPooling2D(pool_size = (2, 2)))
# Add another Convolution layer with 32 kernels of 3X3 shape with activation function ReLU
model_0.add(Conv2D(128, (3, 3), activation = 'relu', padding = 'same'))
# Adding another pooling layer
model_0.add(MaxPooling2D(pool_size = (2, 2)))
# Flattening the layer before fully connected layers
model_0.add(Flatten())
# Adding a fully connected layer with 512 neurons
model_0.add(Dense(units = 512, activation = 'relu'))
# Adding dropout with probability 0.4
model_0.add(Dropout(0.4))
# Adding a fully connected layer with 128 neurons
model_0.add(Dense(units = 128, activation = 'relu'))
# The final output layer with 5 neuron to predict the categorical classifcation
model_0.add(Dense(units = 196, activation = 'softmax'))
earlystopper = EarlyStopping(patience=8, verbose=1)
checkpointer = ModelCheckpoint(filepath = 'model_zero7.{epoch:02d}-{val_accuracy:.6f}.hdf5',
verbose=1,
save_best_only=True, save_weights_only = True)
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2,
patience=2, min_lr=0.000001, verbose=1, cooldown=1)
opt = Adam()
model_0.compile(optimizer = opt, loss = 'categorical_crossentropy', metrics = ['accuracy'])
model_0.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
conv2d_2 (Conv2D) (None, 56, 56, 128) 73856
max_pooling2d_2 (MaxPooling (None, 28, 28, 128) 0
2D)
flatten (Flatten) (None, 100352) 0
dense (Dense) (None, 512) 51380736
dropout (Dropout) (None, 512) 0
dense_1 (Dense) (None, 128) 65664
dense_2 (Dense) (None, 196) 25284
=================================================================
Total params: 51,564,932
Trainable params: 51,564,932
Non-trainable params: 0
_________________________________________________________________
# There are 8144 training images and 8041 test images in total
history = model_0.fit_generator(training_set,
steps_per_epoch = int(8144/32),
epochs = 10,
validation_data = test_set,
validation_steps = int(8041/32),
callbacks=[earlystopper, checkpointer, reduce_lr])
C:\Users\adity\AppData\Local\Temp\ipykernel_12320\135954528.py:2: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators. history = model_0.fit_generator(training_set,
Epoch 1/10 254/254 [==============================] - ETA: 0s - loss: 5.2861 - accuracy: 0.0057 Epoch 1: val_loss improved from inf to 5.27742, saving model to model_zero7.01-0.005976.hdf5 254/254 [==============================] - 78s 277ms/step - loss: 5.2861 - accuracy: 0.0057 - val_loss: 5.2774 - val_accuracy: 0.0060 - lr: 0.0010 Epoch 2/10 254/254 [==============================] - ETA: 0s - loss: 5.2785 - accuracy: 0.0071 Epoch 2: val_loss improved from 5.27742 to 5.27673, saving model to model_zero7.02-0.008466.hdf5 254/254 [==============================] - 64s 252ms/step - loss: 5.2785 - accuracy: 0.0071 - val_loss: 5.2767 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 3/10 254/254 [==============================] - ETA: 0s - loss: 5.2775 - accuracy: 0.0091 Epoch 3: val_loss improved from 5.27673 to 5.27635, saving model to model_zero7.03-0.008964.hdf5 254/254 [==============================] - 63s 249ms/step - loss: 5.2775 - accuracy: 0.0091 - val_loss: 5.2764 - val_accuracy: 0.0090 - lr: 0.0010 Epoch 4/10 254/254 [==============================] - ETA: 0s - loss: 5.2771 - accuracy: 0.0086 Epoch 4: val_loss improved from 5.27635 to 5.27606, saving model to model_zero7.04-0.008715.hdf5 254/254 [==============================] - 62s 245ms/step - loss: 5.2771 - accuracy: 0.0086 - val_loss: 5.2761 - val_accuracy: 0.0087 - lr: 0.0010 Epoch 5/10 254/254 [==============================] - ETA: 0s - loss: 5.2765 - accuracy: 0.0096 Epoch 5: val_loss improved from 5.27606 to 5.27480, saving model to model_zero7.05-0.008715.hdf5 254/254 [==============================] - 66s 258ms/step - loss: 5.2765 - accuracy: 0.0096 - val_loss: 5.2748 - val_accuracy: 0.0087 - lr: 0.0010 Epoch 6/10 254/254 [==============================] - ETA: 0s - loss: 5.2756 - accuracy: 0.0096 Epoch 6: val_loss improved from 5.27480 to 5.27417, saving model to model_zero7.06-0.008715.hdf5 254/254 [==============================] - 64s 252ms/step - loss: 5.2756 - accuracy: 0.0096 - val_loss: 5.2742 - val_accuracy: 0.0087 - lr: 0.0010 Epoch 7/10 254/254 [==============================] - ETA: 0s - loss: 5.2772 - accuracy: 0.0076 Epoch 7: val_loss did not improve from 5.27417 254/254 [==============================] - 64s 252ms/step - loss: 5.2772 - accuracy: 0.0076 - val_loss: 5.2747 - val_accuracy: 0.0080 - lr: 0.0010 Epoch 8/10 254/254 [==============================] - ETA: 0s - loss: 5.2759 - accuracy: 0.0086 Epoch 8: val_loss improved from 5.27417 to 5.27374, saving model to model_zero7.08-0.009462.hdf5 254/254 [==============================] - 64s 250ms/step - loss: 5.2759 - accuracy: 0.0086 - val_loss: 5.2737 - val_accuracy: 0.0095 - lr: 0.0010 Epoch 9/10 254/254 [==============================] - ETA: 0s - loss: 5.2765 - accuracy: 0.0086 Epoch 9: val_loss did not improve from 5.27374 254/254 [==============================] - 61s 240ms/step - loss: 5.2765 - accuracy: 0.0086 - val_loss: 5.2745 - val_accuracy: 0.0082 - lr: 0.0010 Epoch 10/10 254/254 [==============================] - ETA: 0s - loss: 5.2764 - accuracy: 0.0079 Epoch 10: val_loss did not improve from 5.27374 Epoch 10: ReduceLROnPlateau reducing learning rate to 0.00020000000949949026. 254/254 [==============================] - 63s 249ms/step - loss: 5.2764 - accuracy: 0.0079 - val_loss: 5.2746 - val_accuracy: 0.0092 - lr: 0.0010
model_0.save('./model_0.h5')
model_0.save_weights('./model_0.h5')
## Accuracy and Loss plots
import matplotlib.pyplot as plt
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(len(accuracy)) # Get number of epochs
plt.plot (epochs, accuracy, label = 'training accuracy')
plt.plot (epochs, val_accuracy, label = 'validation accuracy')
plt.title ('Training and validation accuracy')
plt.legend(loc = 'lower right')
plt.figure()
plt.plot (epochs, loss, label = 'training loss')
plt.plot (epochs, val_loss, label = 'validation loss')
plt.legend(loc = 'upper right')
plt.title ('Training and validation loss')
Text(0.5, 1.0, 'Training and validation loss')
Insights:
The training accuracy steadily increases over epochs, indicating that the model is effectively learning from the training data. However, the validation accuracy tends to plateau or even slightly decrease after around 4 epochs. This suggests that while the model performs well on the training data, it struggles to generalize to unseen validation data after a certain point.The final validation accuracy is approximately 0.4, indicating that the model correctly classifies around 40% of the validation images.
The training loss consistently decreases over epochs, indicating that the model is effectively minimizing errors on the training data. Conversely, the validation loss initially decreases but then stabilizes or slightly increases after around 4 epochs. This divergence between training and validation loss suggests overfitting, where the model becomes too specialized to the training data and fails to generalize well to new data. The final validation loss is around 2.5, indicating that the model, on average, incurs a loss of approximately 2.5 on the validation data.
Model 1 demonstrates a typical behavior of overfitting, as evidenced by the plateauing or decreasing validation accuracy and the stabilization or slight increase in validation loss.While the model achieves decent training accuracy and relatively low training loss, its performance on unseen validation data is suboptimal, indicating the need for regularization techniques or architectural modifications to mitigate overfitting.
# Check the indices Image Data Generator has allotted to each folder
classes_dict = training_set.class_indices
print(classes_dict)
# Creating a list of classes in test set for showing the result as the folder name
prediction_class = []
for class_name,index in classes_dict.items():
prediction_class.append(class_name)
{'AM General Hummer SUV 2000': 0, 'Acura Integra Type R 2001': 1, 'Acura RL Sedan 2012': 2, 'Acura TL Sedan 2012': 3, 'Acura TL Type-S 2008': 4, 'Acura TSX Sedan 2012': 5, 'Acura ZDX Hatchback 2012': 6, 'Aston Martin V8 Vantage Convertible 2012': 7, 'Aston Martin V8 Vantage Coupe 2012': 8, 'Aston Martin Virage Convertible 2012': 9, 'Aston Martin Virage Coupe 2012': 10, 'Audi 100 Sedan 1994': 11, 'Audi 100 Wagon 1994': 12, 'Audi A5 Coupe 2012': 13, 'Audi R8 Coupe 2012': 14, 'Audi RS 4 Convertible 2008': 15, 'Audi S4 Sedan 2007': 16, 'Audi S4 Sedan 2012': 17, 'Audi S5 Convertible 2012': 18, 'Audi S5 Coupe 2012': 19, 'Audi S6 Sedan 2011': 20, 'Audi TT Hatchback 2011': 21, 'Audi TT RS Coupe 2012': 22, 'Audi TTS Coupe 2012': 23, 'Audi V8 Sedan 1994': 24, 'BMW 1 Series Convertible 2012': 25, 'BMW 1 Series Coupe 2012': 26, 'BMW 3 Series Sedan 2012': 27, 'BMW 3 Series Wagon 2012': 28, 'BMW 6 Series Convertible 2007': 29, 'BMW ActiveHybrid 5 Sedan 2012': 30, 'BMW M3 Coupe 2012': 31, 'BMW M5 Sedan 2010': 32, 'BMW M6 Convertible 2010': 33, 'BMW X3 SUV 2012': 34, 'BMW X5 SUV 2007': 35, 'BMW X6 SUV 2012': 36, 'BMW Z4 Convertible 2012': 37, 'Bentley Arnage Sedan 2009': 38, 'Bentley Continental Flying Spur Sedan 2007': 39, 'Bentley Continental GT Coupe 2007': 40, 'Bentley Continental GT Coupe 2012': 41, 'Bentley Continental Supersports Conv. Convertible 2012': 42, 'Bentley Mulsanne Sedan 2011': 43, 'Bugatti Veyron 16.4 Convertible 2009': 44, 'Bugatti Veyron 16.4 Coupe 2009': 45, 'Buick Enclave SUV 2012': 46, 'Buick Rainier SUV 2007': 47, 'Buick Regal GS 2012': 48, 'Buick Verano Sedan 2012': 49, 'Cadillac CTS-V Sedan 2012': 50, 'Cadillac Escalade EXT Crew Cab 2007': 51, 'Cadillac SRX SUV 2012': 52, 'Chevrolet Avalanche Crew Cab 2012': 53, 'Chevrolet Camaro Convertible 2012': 54, 'Chevrolet Cobalt SS 2010': 55, 'Chevrolet Corvette Convertible 2012': 56, 'Chevrolet Corvette Ron Fellows Edition Z06 2007': 57, 'Chevrolet Corvette ZR1 2012': 58, 'Chevrolet Express Cargo Van 2007': 59, 'Chevrolet Express Van 2007': 60, 'Chevrolet HHR SS 2010': 61, 'Chevrolet Impala Sedan 2007': 62, 'Chevrolet Malibu Hybrid Sedan 2010': 63, 'Chevrolet Malibu Sedan 2007': 64, 'Chevrolet Monte Carlo Coupe 2007': 65, 'Chevrolet Silverado 1500 Classic Extended Cab 2007': 66, 'Chevrolet Silverado 1500 Extended Cab 2012': 67, 'Chevrolet Silverado 1500 Hybrid Crew Cab 2012': 68, 'Chevrolet Silverado 1500 Regular Cab 2012': 69, 'Chevrolet Silverado 2500HD Regular Cab 2012': 70, 'Chevrolet Sonic Sedan 2012': 71, 'Chevrolet Tahoe Hybrid SUV 2012': 72, 'Chevrolet TrailBlazer SS 2009': 73, 'Chevrolet Traverse SUV 2012': 74, 'Chrysler 300 SRT-8 2010': 75, 'Chrysler Aspen SUV 2009': 76, 'Chrysler Crossfire Convertible 2008': 77, 'Chrysler PT Cruiser Convertible 2008': 78, 'Chrysler Sebring Convertible 2010': 79, 'Chrysler Town and Country Minivan 2012': 80, 'Daewoo Nubira Wagon 2002': 81, 'Dodge Caliber Wagon 2007': 82, 'Dodge Caliber Wagon 2012': 83, 'Dodge Caravan Minivan 1997': 84, 'Dodge Challenger SRT8 2011': 85, 'Dodge Charger SRT-8 2009': 86, 'Dodge Charger Sedan 2012': 87, 'Dodge Dakota Club Cab 2007': 88, 'Dodge Dakota Crew Cab 2010': 89, 'Dodge Durango SUV 2007': 90, 'Dodge Durango SUV 2012': 91, 'Dodge Journey SUV 2012': 92, 'Dodge Magnum Wagon 2008': 93, 'Dodge Ram Pickup 3500 Crew Cab 2010': 94, 'Dodge Ram Pickup 3500 Quad Cab 2009': 95, 'Dodge Sprinter Cargo Van 2009': 96, 'Eagle Talon Hatchback 1998': 97, 'FIAT 500 Abarth 2012': 98, 'FIAT 500 Convertible 2012': 99, 'Ferrari 458 Italia Convertible 2012': 100, 'Ferrari 458 Italia Coupe 2012': 101, 'Ferrari California Convertible 2012': 102, 'Ferrari FF Coupe 2012': 103, 'Fisker Karma Sedan 2012': 104, 'Ford E-Series Wagon Van 2012': 105, 'Ford Edge SUV 2012': 106, 'Ford Expedition EL SUV 2009': 107, 'Ford F-150 Regular Cab 2007': 108, 'Ford F-150 Regular Cab 2012': 109, 'Ford F-450 Super Duty Crew Cab 2012': 110, 'Ford Fiesta Sedan 2012': 111, 'Ford Focus Sedan 2007': 112, 'Ford Freestar Minivan 2007': 113, 'Ford GT Coupe 2006': 114, 'Ford Mustang Convertible 2007': 115, 'Ford Ranger SuperCab 2011': 116, 'GMC Acadia SUV 2012': 117, 'GMC Canyon Extended Cab 2012': 118, 'GMC Savana Van 2012': 119, 'GMC Terrain SUV 2012': 120, 'GMC Yukon Hybrid SUV 2012': 121, 'Geo Metro Convertible 1993': 122, 'HUMMER H2 SUT Crew Cab 2009': 123, 'HUMMER H3T Crew Cab 2010': 124, 'Honda Accord Coupe 2012': 125, 'Honda Accord Sedan 2012': 126, 'Honda Odyssey Minivan 2007': 127, 'Honda Odyssey Minivan 2012': 128, 'Hyundai Accent Sedan 2012': 129, 'Hyundai Azera Sedan 2012': 130, 'Hyundai Elantra Sedan 2007': 131, 'Hyundai Elantra Touring Hatchback 2012': 132, 'Hyundai Genesis Sedan 2012': 133, 'Hyundai Santa Fe SUV 2012': 134, 'Hyundai Sonata Hybrid Sedan 2012': 135, 'Hyundai Sonata Sedan 2012': 136, 'Hyundai Tucson SUV 2012': 137, 'Hyundai Veloster Hatchback 2012': 138, 'Hyundai Veracruz SUV 2012': 139, 'Infiniti G Coupe IPL 2012': 140, 'Infiniti QX56 SUV 2011': 141, 'Isuzu Ascender SUV 2008': 142, 'Jaguar XK XKR 2012': 143, 'Jeep Compass SUV 2012': 144, 'Jeep Grand Cherokee SUV 2012': 145, 'Jeep Liberty SUV 2012': 146, 'Jeep Patriot SUV 2012': 147, 'Jeep Wrangler SUV 2012': 148, 'Lamborghini Aventador Coupe 2012': 149, 'Lamborghini Diablo Coupe 2001': 150, 'Lamborghini Gallardo LP 570-4 Superleggera 2012': 151, 'Lamborghini Reventon Coupe 2008': 152, 'Land Rover LR2 SUV 2012': 153, 'Land Rover Range Rover SUV 2012': 154, 'Lincoln Town Car Sedan 2011': 155, 'MINI Cooper Roadster Convertible 2012': 156, 'Maybach Landaulet Convertible 2012': 157, 'Mazda Tribute SUV 2011': 158, 'McLaren MP4-12C Coupe 2012': 159, 'Mercedes-Benz 300-Class Convertible 1993': 160, 'Mercedes-Benz C-Class Sedan 2012': 161, 'Mercedes-Benz E-Class Sedan 2012': 162, 'Mercedes-Benz S-Class Sedan 2012': 163, 'Mercedes-Benz SL-Class Coupe 2009': 164, 'Mercedes-Benz Sprinter Van 2012': 165, 'Mitsubishi Lancer Sedan 2012': 166, 'Nissan 240SX Coupe 1998': 167, 'Nissan Juke Hatchback 2012': 168, 'Nissan Leaf Hatchback 2012': 169, 'Nissan NV Passenger Van 2012': 170, 'Plymouth Neon Coupe 1999': 171, 'Porsche Panamera Sedan 2012': 172, 'Ram C-V Cargo Van Minivan 2012': 173, 'Rolls-Royce Ghost Sedan 2012': 174, 'Rolls-Royce Phantom Drophead Coupe Convertible 2012': 175, 'Rolls-Royce Phantom Sedan 2012': 176, 'Scion xD Hatchback 2012': 177, 'Spyker C8 Convertible 2009': 178, 'Spyker C8 Coupe 2009': 179, 'Suzuki Aerio Sedan 2007': 180, 'Suzuki Kizashi Sedan 2012': 181, 'Suzuki SX4 Hatchback 2012': 182, 'Suzuki SX4 Sedan 2012': 183, 'Tesla Model S Sedan 2012': 184, 'Toyota 4Runner SUV 2012': 185, 'Toyota Camry Sedan 2012': 186, 'Toyota Corolla Sedan 2012': 187, 'Toyota Sequoia SUV 2012': 188, 'Volkswagen Beetle Hatchback 2012': 189, 'Volkswagen Golf Hatchback 1991': 190, 'Volkswagen Golf Hatchback 2012': 191, 'Volvo 240 Sedan 1993': 192, 'Volvo C30 Hatchback 2012': 193, 'Volvo XC90 SUV 2007': 194, 'smart fortwo Convertible 2012': 195}
# Performance of 1st Model
# Re-initalizing the test data generator with shuffle=False to create the confusion matrix
import numpy as np
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 32,
shuffle=False,
class_mode = 'categorical')
# Predict the whole generator to get predictions
Y_pred = model_0.predict_generator(test_set, int(8041/32+1))
# Find out the predictions classes with maximum probability
y_pred = np.argmax(Y_pred, axis=1)
# Utilities for confusion matrix
from sklearn.metrics import classification_report, confusion_matrix
# Printing the confusion matrix based on the actual data vs predicted data.
print(confusion_matrix(test_set.classes, y_pred))
# Printing the classification report
print(classification_report(test_set.classes, y_pred, target_names=prediction_class))
Found 8041 images belonging to 196 classes.
C:\Users\adity\AppData\Local\Temp\ipykernel_12320\4093217754.py:11: UserWarning: `Model.predict_generator` is deprecated and will be removed in a future version. Please use `Model.predict`, which supports generators. Y_pred = model_0.predict_generator(test_set, int(8041/32+1))
[[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
...
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]]
precision recall f1-score support
AM General Hummer SUV 2000 0.00 0.00 0.00 44
Acura Integra Type R 2001 0.00 0.00 0.00 44
Acura RL Sedan 2012 0.00 0.00 0.00 32
Acura TL Sedan 2012 0.00 0.00 0.00 43
Acura TL Type-S 2008 0.00 0.00 0.00 42
Acura TSX Sedan 2012 0.00 0.00 0.00 40
Acura ZDX Hatchback 2012 0.00 0.00 0.00 39
Aston Martin V8 Vantage Convertible 2012 0.00 0.00 0.00 45
Aston Martin V8 Vantage Coupe 2012 0.00 0.00 0.00 41
Aston Martin Virage Convertible 2012 0.00 0.00 0.00 33
Aston Martin Virage Coupe 2012 0.00 0.00 0.00 38
Audi 100 Sedan 1994 0.00 0.00 0.00 40
Audi 100 Wagon 1994 0.00 0.00 0.00 42
Audi A5 Coupe 2012 0.00 0.00 0.00 41
Audi R8 Coupe 2012 0.00 0.00 0.00 43
Audi RS 4 Convertible 2008 0.00 0.00 0.00 36
Audi S4 Sedan 2007 0.00 0.00 0.00 45
Audi S4 Sedan 2012 0.00 0.00 0.00 39
Audi S5 Convertible 2012 0.00 0.00 0.00 42
Audi S5 Coupe 2012 0.00 0.00 0.00 42
Audi S6 Sedan 2011 0.00 0.00 0.00 46
Audi TT Hatchback 2011 0.00 0.00 0.00 40
Audi TT RS Coupe 2012 0.00 0.00 0.00 39
Audi TTS Coupe 2012 0.00 0.00 0.00 42
Audi V8 Sedan 1994 0.00 0.00 0.00 43
BMW 1 Series Convertible 2012 0.00 0.00 0.00 35
BMW 1 Series Coupe 2012 0.00 0.00 0.00 41
BMW 3 Series Sedan 2012 0.00 0.00 0.00 42
BMW 3 Series Wagon 2012 0.00 0.00 0.00 41
BMW 6 Series Convertible 2007 0.00 0.00 0.00 44
BMW ActiveHybrid 5 Sedan 2012 0.00 0.00 0.00 34
BMW M3 Coupe 2012 0.00 0.00 0.00 44
BMW M5 Sedan 2010 0.00 0.00 0.00 41
BMW M6 Convertible 2010 0.00 0.00 0.00 41
BMW X3 SUV 2012 0.00 0.00 0.00 38
BMW X5 SUV 2007 0.00 0.00 0.00 41
BMW X6 SUV 2012 0.00 0.00 0.00 42
BMW Z4 Convertible 2012 0.00 0.00 0.00 40
Bentley Arnage Sedan 2009 0.00 0.00 0.00 39
Bentley Continental Flying Spur Sedan 2007 0.00 0.00 0.00 44
Bentley Continental GT Coupe 2007 0.00 0.00 0.00 46
Bentley Continental GT Coupe 2012 0.00 0.00 0.00 34
Bentley Continental Supersports Conv. Convertible 2012 0.00 0.00 0.00 36
Bentley Mulsanne Sedan 2011 0.00 0.00 0.00 35
Bugatti Veyron 16.4 Convertible 2009 0.00 0.00 0.00 32
Bugatti Veyron 16.4 Coupe 2009 0.00 0.00 0.00 43
Buick Enclave SUV 2012 0.00 0.00 0.00 42
Buick Rainier SUV 2007 0.00 0.00 0.00 42
Buick Regal GS 2012 0.00 0.00 0.00 35
Buick Verano Sedan 2012 0.00 0.00 0.00 37
Cadillac CTS-V Sedan 2012 0.00 0.00 0.00 43
Cadillac Escalade EXT Crew Cab 2007 0.00 0.00 0.00 44
Cadillac SRX SUV 2012 0.00 0.00 0.00 41
Chevrolet Avalanche Crew Cab 2012 0.00 0.00 0.00 45
Chevrolet Camaro Convertible 2012 0.00 0.00 0.00 44
Chevrolet Cobalt SS 2010 0.00 0.00 0.00 41
Chevrolet Corvette Convertible 2012 0.00 0.00 0.00 39
Chevrolet Corvette Ron Fellows Edition Z06 2007 0.00 0.00 0.00 37
Chevrolet Corvette ZR1 2012 0.00 0.00 0.00 46
Chevrolet Express Cargo Van 2007 0.00 0.00 0.00 29
Chevrolet Express Van 2007 0.00 0.00 0.00 35
Chevrolet HHR SS 2010 0.00 0.00 0.00 36
Chevrolet Impala Sedan 2007 0.00 0.00 0.00 43
Chevrolet Malibu Hybrid Sedan 2010 0.00 0.00 0.00 38
Chevrolet Malibu Sedan 2007 0.00 0.00 0.00 44
Chevrolet Monte Carlo Coupe 2007 0.00 0.00 0.00 45
Chevrolet Silverado 1500 Classic Extended Cab 2007 0.00 0.00 0.00 42
Chevrolet Silverado 1500 Extended Cab 2012 0.00 0.00 0.00 43
Chevrolet Silverado 1500 Hybrid Crew Cab 2012 0.00 0.00 0.00 40
Chevrolet Silverado 1500 Regular Cab 2012 0.00 0.00 0.00 44
Chevrolet Silverado 2500HD Regular Cab 2012 0.00 0.00 0.00 38
Chevrolet Sonic Sedan 2012 0.00 0.00 0.00 44
Chevrolet Tahoe Hybrid SUV 2012 0.00 0.00 0.00 37
Chevrolet TrailBlazer SS 2009 0.00 0.00 0.00 40
Chevrolet Traverse SUV 2012 0.00 0.00 0.00 44
Chrysler 300 SRT-8 2010 0.00 0.00 0.00 48
Chrysler Aspen SUV 2009 0.00 0.00 0.00 43
Chrysler Crossfire Convertible 2008 0.00 0.00 0.00 43
Chrysler PT Cruiser Convertible 2008 0.00 0.00 0.00 45
Chrysler Sebring Convertible 2010 0.00 0.00 0.00 40
Chrysler Town and Country Minivan 2012 0.00 0.00 0.00 37
Daewoo Nubira Wagon 2002 0.00 0.00 0.00 45
Dodge Caliber Wagon 2007 0.00 0.00 0.00 42
Dodge Caliber Wagon 2012 0.00 0.00 0.00 40
Dodge Caravan Minivan 1997 0.00 0.00 0.00 43
Dodge Challenger SRT8 2011 0.00 0.00 0.00 39
Dodge Charger SRT-8 2009 0.00 0.00 0.00 42
Dodge Charger Sedan 2012 0.00 0.00 0.00 41
Dodge Dakota Club Cab 2007 0.00 0.00 0.00 38
Dodge Dakota Crew Cab 2010 0.00 0.00 0.00 41
Dodge Durango SUV 2007 0.00 0.00 0.00 45
Dodge Durango SUV 2012 0.00 0.00 0.00 43
Dodge Journey SUV 2012 0.00 0.00 0.00 44
Dodge Magnum Wagon 2008 0.00 0.00 0.00 40
Dodge Ram Pickup 3500 Crew Cab 2010 0.00 0.00 0.00 42
Dodge Ram Pickup 3500 Quad Cab 2009 0.00 0.00 0.00 44
Dodge Sprinter Cargo Van 2009 0.00 0.00 0.00 39
Eagle Talon Hatchback 1998 0.00 0.00 0.00 46
FIAT 500 Abarth 2012 0.00 0.00 0.00 27
FIAT 500 Convertible 2012 0.00 0.00 0.00 33
Ferrari 458 Italia Convertible 2012 0.00 0.00 0.00 39
Ferrari 458 Italia Coupe 2012 0.00 0.00 0.00 42
Ferrari California Convertible 2012 0.00 0.00 0.00 39
Ferrari FF Coupe 2012 0.00 0.00 0.00 42
Fisker Karma Sedan 2012 0.00 0.00 0.00 43
Ford E-Series Wagon Van 2012 0.00 0.00 0.00 37
Ford Edge SUV 2012 0.00 0.00 0.00 43
Ford Expedition EL SUV 2009 0.00 0.00 0.00 44
Ford F-150 Regular Cab 2007 0.00 0.00 0.00 45
Ford F-150 Regular Cab 2012 0.00 0.00 0.00 42
Ford F-450 Super Duty Crew Cab 2012 0.00 0.00 0.00 41
Ford Fiesta Sedan 2012 0.00 0.00 0.00 42
Ford Focus Sedan 2007 0.00 0.00 0.00 45
Ford Freestar Minivan 2007 0.00 0.00 0.00 44
Ford GT Coupe 2006 0.00 0.00 0.00 45
Ford Mustang Convertible 2007 0.00 0.00 0.00 44
Ford Ranger SuperCab 2011 0.00 0.00 0.00 42
GMC Acadia SUV 2012 0.00 0.00 0.00 44
GMC Canyon Extended Cab 2012 0.00 0.00 0.00 40
GMC Savana Van 2012 0.01 1.00 0.02 68
GMC Terrain SUV 2012 0.00 0.00 0.00 41
GMC Yukon Hybrid SUV 2012 0.00 0.00 0.00 42
Geo Metro Convertible 1993 0.00 0.00 0.00 44
HUMMER H2 SUT Crew Cab 2009 0.00 0.00 0.00 43
HUMMER H3T Crew Cab 2010 0.00 0.00 0.00 39
Honda Accord Coupe 2012 0.00 0.00 0.00 39
Honda Accord Sedan 2012 0.00 0.00 0.00 38
Honda Odyssey Minivan 2007 0.00 0.00 0.00 41
Honda Odyssey Minivan 2012 0.00 0.00 0.00 42
Hyundai Accent Sedan 2012 0.00 0.00 0.00 24
Hyundai Azera Sedan 2012 0.00 0.00 0.00 42
Hyundai Elantra Sedan 2007 0.00 0.00 0.00 42
Hyundai Elantra Touring Hatchback 2012 0.00 0.00 0.00 42
Hyundai Genesis Sedan 2012 0.00 0.00 0.00 43
Hyundai Santa Fe SUV 2012 0.00 0.00 0.00 42
Hyundai Sonata Hybrid Sedan 2012 0.00 0.00 0.00 33
Hyundai Sonata Sedan 2012 0.00 0.00 0.00 39
Hyundai Tucson SUV 2012 0.00 0.00 0.00 43
Hyundai Veloster Hatchback 2012 0.00 0.00 0.00 41
Hyundai Veracruz SUV 2012 0.00 0.00 0.00 42
Infiniti G Coupe IPL 2012 0.00 0.00 0.00 34
Infiniti QX56 SUV 2011 0.00 0.00 0.00 32
Isuzu Ascender SUV 2008 0.00 0.00 0.00 40
Jaguar XK XKR 2012 0.00 0.00 0.00 46
Jeep Compass SUV 2012 0.00 0.00 0.00 42
Jeep Grand Cherokee SUV 2012 0.00 0.00 0.00 45
Jeep Liberty SUV 2012 0.00 0.00 0.00 44
Jeep Patriot SUV 2012 0.00 0.00 0.00 44
Jeep Wrangler SUV 2012 0.00 0.00 0.00 43
Lamborghini Aventador Coupe 2012 0.00 0.00 0.00 43
Lamborghini Diablo Coupe 2001 0.00 0.00 0.00 44
Lamborghini Gallardo LP 570-4 Superleggera 2012 0.00 0.00 0.00 35
Lamborghini Reventon Coupe 2008 0.00 0.00 0.00 36
Land Rover LR2 SUV 2012 0.00 0.00 0.00 42
Land Rover Range Rover SUV 2012 0.00 0.00 0.00 42
Lincoln Town Car Sedan 2011 0.00 0.00 0.00 39
MINI Cooper Roadster Convertible 2012 0.00 0.00 0.00 36
Maybach Landaulet Convertible 2012 0.00 0.00 0.00 29
Mazda Tribute SUV 2011 0.00 0.00 0.00 36
McLaren MP4-12C Coupe 2012 0.00 0.00 0.00 44
Mercedes-Benz 300-Class Convertible 1993 0.00 0.00 0.00 48
Mercedes-Benz C-Class Sedan 2012 0.00 0.00 0.00 45
Mercedes-Benz E-Class Sedan 2012 0.00 0.00 0.00 43
Mercedes-Benz S-Class Sedan 2012 0.00 0.00 0.00 44
Mercedes-Benz SL-Class Coupe 2009 0.00 0.00 0.00 36
Mercedes-Benz Sprinter Van 2012 0.00 0.00 0.00 41
Mitsubishi Lancer Sedan 2012 0.00 0.00 0.00 47
Nissan 240SX Coupe 1998 0.00 0.00 0.00 46
Nissan Juke Hatchback 2012 0.00 0.00 0.00 44
Nissan Leaf Hatchback 2012 0.00 0.00 0.00 42
Nissan NV Passenger Van 2012 0.00 0.00 0.00 38
Plymouth Neon Coupe 1999 0.00 0.00 0.00 44
Porsche Panamera Sedan 2012 0.00 0.00 0.00 43
Ram C-V Cargo Van Minivan 2012 0.00 0.00 0.00 41
Rolls-Royce Ghost Sedan 2012 0.00 0.00 0.00 38
Rolls-Royce Phantom Drophead Coupe Convertible 2012 0.00 0.00 0.00 30
Rolls-Royce Phantom Sedan 2012 0.00 0.00 0.00 44
Scion xD Hatchback 2012 0.00 0.00 0.00 41
Spyker C8 Convertible 2009 0.00 0.00 0.00 45
Spyker C8 Coupe 2009 0.00 0.00 0.00 42
Suzuki Aerio Sedan 2007 0.00 0.00 0.00 38
Suzuki Kizashi Sedan 2012 0.00 0.00 0.00 46
Suzuki SX4 Hatchback 2012 0.00 0.00 0.00 42
Suzuki SX4 Sedan 2012 0.00 0.00 0.00 40
Tesla Model S Sedan 2012 0.00 0.00 0.00 38
Toyota 4Runner SUV 2012 0.00 0.00 0.00 40
Toyota Camry Sedan 2012 0.00 0.00 0.00 43
Toyota Corolla Sedan 2012 0.00 0.00 0.00 43
Toyota Sequoia SUV 2012 0.00 0.00 0.00 38
Volkswagen Beetle Hatchback 2012 0.00 0.00 0.00 42
Volkswagen Golf Hatchback 1991 0.00 0.00 0.00 46
Volkswagen Golf Hatchback 2012 0.00 0.00 0.00 43
Volvo 240 Sedan 1993 0.00 0.00 0.00 45
Volvo C30 Hatchback 2012 0.00 0.00 0.00 41
Volvo XC90 SUV 2007 0.00 0.00 0.00 43
smart fortwo Convertible 2012 0.00 0.00 0.00 40
accuracy 0.01 8041
macro avg 0.00 0.01 0.00 8041
weighted avg 0.00 0.01 0.00 8041
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
# Model 2
model_1 = Sequential()
model_1.add(BatchNormalization(input_shape = (224, 224, 3)))
model_1.add(Convolution2D(filters = 32, kernel_size = 3, activation ='relu', input_shape = (224, 224, 3)))
model_1.add(MaxPooling2D(pool_size = 2))
model_1.add(Convolution2D(filters = 64, kernel_size = 3, padding = 'same', activation = 'relu'))
model_1.add(MaxPooling2D(pool_size = 2))
model_1.add(Flatten())
# fully connected layer
model_1.add(Dense(units = 64,activation = 'relu'))
# Classification layer
model_1.add(Dense(units = 196, activation = 'softmax'))
earlystopper = EarlyStopping(patience=8, verbose=1)
checkpointer = ModelCheckpoint(filepath = 'model_one7.{epoch:02d}-{val_accuracy:.6f}.hdf5',
verbose=1,
save_best_only=True, save_weights_only = True)
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2,
patience=2, min_lr=0.000001, verbose=1, cooldown=1)
opt = Adam()
model_1.compile(optimizer = opt, loss = 'categorical_crossentropy', metrics = ['accuracy'])
model_1.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
batch_normalization (BatchN (None, 224, 224, 3) 12
ormalization)
conv2d_3 (Conv2D) (None, 222, 222, 32) 896
max_pooling2d_3 (MaxPooling (None, 111, 111, 32) 0
2D)
conv2d_4 (Conv2D) (None, 111, 111, 64) 18496
max_pooling2d_4 (MaxPooling (None, 55, 55, 64) 0
2D)
flatten_1 (Flatten) (None, 193600) 0
dense_3 (Dense) (None, 64) 12390464
dense_4 (Dense) (None, 196) 12740
=================================================================
Total params: 12,422,608
Trainable params: 12,422,602
Non-trainable params: 6
_________________________________________________________________
# There are 8144 training images and 8041 test images in total
history = model_1.fit_generator(training_set,
steps_per_epoch = int(8144/32),
epochs = 10,
validation_data = test_set,
validation_steps = int(8041/32),
callbacks=[earlystopper, checkpointer, reduce_lr])
C:\Users\adity\AppData\Local\Temp\ipykernel_20252\3579547475.py:2: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators. history = model_1.fit_generator(training_set,
Epoch 1/10 254/254 [==============================] - ETA: 0s - loss: 5.4361 - accuracy: 0.0064 Epoch 1: val_loss improved from inf to 5.28905, saving model to model_one7.01-0.006599.hdf5 254/254 [==============================] - 79s 304ms/step - loss: 5.4361 - accuracy: 0.0064 - val_loss: 5.2891 - val_accuracy: 0.0066 - lr: 0.0010 Epoch 2/10 254/254 [==============================] - ETA: 0s - loss: 5.2449 - accuracy: 0.0091 Epoch 2: val_loss improved from 5.28905 to 5.22936, saving model to model_one7.02-0.011703.hdf5 254/254 [==============================] - 77s 302ms/step - loss: 5.2449 - accuracy: 0.0091 - val_loss: 5.2294 - val_accuracy: 0.0117 - lr: 0.0010 Epoch 3/10 254/254 [==============================] - ETA: 0s - loss: 5.2023 - accuracy: 0.0128 Epoch 3: val_loss improved from 5.22936 to 5.18231, saving model to model_one7.03-0.010707.hdf5 254/254 [==============================] - 76s 299ms/step - loss: 5.2023 - accuracy: 0.0128 - val_loss: 5.1823 - val_accuracy: 0.0107 - lr: 0.0010 Epoch 4/10 254/254 [==============================] - ETA: 0s - loss: 5.1754 - accuracy: 0.0113 Epoch 4: val_loss improved from 5.18231 to 5.16368, saving model to model_one7.04-0.013695.hdf5 254/254 [==============================] - 76s 299ms/step - loss: 5.1754 - accuracy: 0.0113 - val_loss: 5.1637 - val_accuracy: 0.0137 - lr: 0.0010 Epoch 5/10 254/254 [==============================] - ETA: 0s - loss: 5.1447 - accuracy: 0.0170 Epoch 5: val_loss did not improve from 5.16368 254/254 [==============================] - 75s 296ms/step - loss: 5.1447 - accuracy: 0.0170 - val_loss: 5.1769 - val_accuracy: 0.0143 - lr: 0.0010 Epoch 6/10 254/254 [==============================] - ETA: 0s - loss: 5.1146 - accuracy: 0.0192 Epoch 6: val_loss improved from 5.16368 to 5.15439, saving model to model_one7.06-0.012824.hdf5 254/254 [==============================] - 76s 298ms/step - loss: 5.1146 - accuracy: 0.0192 - val_loss: 5.1544 - val_accuracy: 0.0128 - lr: 0.0010 Epoch 7/10 254/254 [==============================] - ETA: 0s - loss: 5.1054 - accuracy: 0.0143 Epoch 7: val_loss did not improve from 5.15439 254/254 [==============================] - 76s 299ms/step - loss: 5.1054 - accuracy: 0.0143 - val_loss: 5.1797 - val_accuracy: 0.0138 - lr: 0.0010 Epoch 8/10 254/254 [==============================] - ETA: 0s - loss: 5.0767 - accuracy: 0.0214 Epoch 8: val_loss improved from 5.15439 to 5.11236, saving model to model_one7.08-0.018800.hdf5 254/254 [==============================] - 76s 299ms/step - loss: 5.0767 - accuracy: 0.0214 - val_loss: 5.1124 - val_accuracy: 0.0188 - lr: 0.0010 Epoch 9/10 254/254 [==============================] - ETA: 0s - loss: 5.0648 - accuracy: 0.0209 Epoch 9: val_loss did not improve from 5.11236 254/254 [==============================] - 75s 297ms/step - loss: 5.0648 - accuracy: 0.0209 - val_loss: 5.1238 - val_accuracy: 0.0195 - lr: 0.0010 Epoch 10/10 254/254 [==============================] - ETA: 0s - loss: 5.0202 - accuracy: 0.0224 Epoch 10: val_loss improved from 5.11236 to 5.08217, saving model to model_one7.10-0.025149.hdf5 254/254 [==============================] - 79s 313ms/step - loss: 5.0202 - accuracy: 0.0224 - val_loss: 5.0822 - val_accuracy: 0.0251 - lr: 0.0010
model_1.save('./model_1.h5')
model_1.save_weights('./model_1.h5')
## Accuracy and Loss plots
import matplotlib.pyplot as plt
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(len(accuracy)) # Get number of epochs
plt.plot (epochs, accuracy, label = 'training accuracy')
plt.plot (epochs, val_accuracy, label = 'validation accuracy')
plt.title ('Training and validation accuracy')
plt.legend(loc = 'lower right')
plt.figure()
plt.plot (epochs, loss, label = 'training loss')
plt.plot (epochs, val_loss, label = 'validation loss')
plt.legend(loc = 'upper right')
plt.title ('Training and validation loss')
Text(0.5, 1.0, 'Training and validation loss')
Insights:
Both the training and validation accuracy increase steadily over epochs, indicating that the model is effectively learning from the training data and generalizing well to unseen validation data. There is no significant gap between the training and validation accuracy curves, suggesting that the model is not overfitting to the training data. The final validation accuracy is approximately 0.35, indicating that the model correctly classifies around 35% of the validation images.
The training loss consistently decreases over epochs, indicating that the model effectively minimizes errors on the training data. Similarly, the validation loss steadily decreases over epochs, indicating that the model generalizes well to unseen validation data without overfitting. The final validation loss is around 3.0, indicating that the model, on average, incurs a loss of approximately 3.0 on the validation data.
Model 2 demonstrates a balanced behavior with consistent improvements in both training and validation accuracy, as well as reductions in training and validation loss over epochs. The absence of significant gaps between the training and validation curves suggests that the model generalizes well to new data without overfitting. While the final validation accuracy is lower compared to Model 1, Model 2's behavior indicates better generalization capabilities and less susceptibility to overfitting.
# Performance of 2nd Model
# Re-initalizing the test data generator with shuffle=False to create the confusion matrix
import numpy as np
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 32,
shuffle=False,
class_mode = 'categorical')
# Predict the whole generator to get predictions
Y_pred = model_1.predict_generator(test_set, int(8041/32+1))
# Find out the predictions classes with maximum probability
y_pred = np.argmax(Y_pred, axis=1)
# Utilities for confusion matrix
from sklearn.metrics import classification_report, confusion_matrix
# Printing the confusion matrix based on the actual data vs predicted data.
print(confusion_matrix(test_set.classes, y_pred))
# Printing the classification report
print(classification_report(test_set.classes, y_pred, target_names=prediction_class))
Found 8041 images belonging to 196 classes.
C:\Users\adity\AppData\Local\Temp\ipykernel_20252\1545236181.py:11: UserWarning: `Model.predict_generator` is deprecated and will be removed in a future version. Please use `Model.predict`, which supports generators. Y_pred = model_1.predict_generator(test_set, int(8041/32+1))
[[8 1 0 ... 0 2 0]
[1 4 0 ... 0 2 0]
[0 0 0 ... 0 0 0]
...
[3 0 0 ... 0 0 0]
[0 0 0 ... 0 3 0]
[0 1 0 ... 0 2 0]]
precision recall f1-score support
AM General Hummer SUV 2000 0.10 0.18 0.13 44
Acura Integra Type R 2001 0.06 0.09 0.07 44
Acura RL Sedan 2012 0.00 0.00 0.00 32
Acura TL Sedan 2012 0.02 0.05 0.03 43
Acura TL Type-S 2008 0.00 0.00 0.00 42
Acura TSX Sedan 2012 0.00 0.00 0.00 40
Acura ZDX Hatchback 2012 0.00 0.00 0.00 39
Aston Martin V8 Vantage Convertible 2012 0.00 0.00 0.00 45
Aston Martin V8 Vantage Coupe 2012 0.00 0.00 0.00 41
Aston Martin Virage Convertible 2012 0.00 0.00 0.00 33
Aston Martin Virage Coupe 2012 0.00 0.00 0.00 38
Audi 100 Sedan 1994 0.07 0.03 0.04 40
Audi 100 Wagon 1994 0.00 0.00 0.00 42
Audi A5 Coupe 2012 0.00 0.00 0.00 41
Audi R8 Coupe 2012 0.00 0.00 0.00 43
Audi RS 4 Convertible 2008 0.00 0.00 0.00 36
Audi S4 Sedan 2007 0.00 0.00 0.00 45
Audi S4 Sedan 2012 0.00 0.00 0.00 39
Audi S5 Convertible 2012 0.00 0.00 0.00 42
Audi S5 Coupe 2012 0.00 0.00 0.00 42
Audi S6 Sedan 2011 0.02 0.41 0.03 46
Audi TT Hatchback 2011 0.00 0.00 0.00 40
Audi TT RS Coupe 2012 0.07 0.18 0.10 39
Audi TTS Coupe 2012 0.03 0.02 0.03 42
Audi V8 Sedan 1994 0.04 0.02 0.03 43
BMW 1 Series Convertible 2012 0.00 0.00 0.00 35
BMW 1 Series Coupe 2012 0.02 0.05 0.03 41
BMW 3 Series Sedan 2012 0.03 0.02 0.03 42
BMW 3 Series Wagon 2012 0.00 0.00 0.00 41
BMW 6 Series Convertible 2007 0.03 0.02 0.03 44
BMW ActiveHybrid 5 Sedan 2012 0.00 0.00 0.00 34
BMW M3 Coupe 2012 0.00 0.00 0.00 44
BMW M5 Sedan 2010 0.00 0.00 0.00 41
BMW M6 Convertible 2010 0.00 0.00 0.00 41
BMW X3 SUV 2012 0.00 0.00 0.00 38
BMW X5 SUV 2007 0.00 0.00 0.00 41
BMW X6 SUV 2012 0.04 0.05 0.04 42
BMW Z4 Convertible 2012 0.00 0.00 0.00 40
Bentley Arnage Sedan 2009 0.00 0.00 0.00 39
Bentley Continental Flying Spur Sedan 2007 0.05 0.11 0.07 44
Bentley Continental GT Coupe 2007 0.01 0.04 0.02 46
Bentley Continental GT Coupe 2012 0.00 0.00 0.00 34
Bentley Continental Supersports Conv. Convertible 2012 0.00 0.00 0.00 36
Bentley Mulsanne Sedan 2011 0.00 0.00 0.00 35
Bugatti Veyron 16.4 Convertible 2009 0.00 0.00 0.00 32
Bugatti Veyron 16.4 Coupe 2009 0.03 0.05 0.03 43
Buick Enclave SUV 2012 0.00 0.00 0.00 42
Buick Rainier SUV 2007 0.00 0.00 0.00 42
Buick Regal GS 2012 0.00 0.00 0.00 35
Buick Verano Sedan 2012 0.00 0.00 0.00 37
Cadillac CTS-V Sedan 2012 0.00 0.00 0.00 43
Cadillac Escalade EXT Crew Cab 2007 0.00 0.00 0.00 44
Cadillac SRX SUV 2012 0.00 0.00 0.00 41
Chevrolet Avalanche Crew Cab 2012 0.00 0.00 0.00 45
Chevrolet Camaro Convertible 2012 0.00 0.00 0.00 44
Chevrolet Cobalt SS 2010 0.00 0.00 0.00 41
Chevrolet Corvette Convertible 2012 0.00 0.00 0.00 39
Chevrolet Corvette Ron Fellows Edition Z06 2007 0.25 0.05 0.09 37
Chevrolet Corvette ZR1 2012 0.02 0.02 0.02 46
Chevrolet Express Cargo Van 2007 0.00 0.00 0.00 29
Chevrolet Express Van 2007 0.00 0.00 0.00 35
Chevrolet HHR SS 2010 0.02 0.03 0.03 36
Chevrolet Impala Sedan 2007 0.00 0.00 0.00 43
Chevrolet Malibu Hybrid Sedan 2010 0.00 0.00 0.00 38
Chevrolet Malibu Sedan 2007 0.00 0.00 0.00 44
Chevrolet Monte Carlo Coupe 2007 0.00 0.00 0.00 45
Chevrolet Silverado 1500 Classic Extended Cab 2007 0.00 0.00 0.00 42
Chevrolet Silverado 1500 Extended Cab 2012 0.00 0.00 0.00 43
Chevrolet Silverado 1500 Hybrid Crew Cab 2012 0.00 0.00 0.00 40
Chevrolet Silverado 1500 Regular Cab 2012 0.00 0.00 0.00 44
Chevrolet Silverado 2500HD Regular Cab 2012 0.00 0.00 0.00 38
Chevrolet Sonic Sedan 2012 0.00 0.00 0.00 44
Chevrolet Tahoe Hybrid SUV 2012 0.00 0.00 0.00 37
Chevrolet TrailBlazer SS 2009 0.12 0.10 0.11 40
Chevrolet Traverse SUV 2012 0.00 0.00 0.00 44
Chrysler 300 SRT-8 2010 0.00 0.00 0.00 48
Chrysler Aspen SUV 2009 0.00 0.00 0.00 43
Chrysler Crossfire Convertible 2008 0.00 0.00 0.00 43
Chrysler PT Cruiser Convertible 2008 0.00 0.00 0.00 45
Chrysler Sebring Convertible 2010 0.00 0.00 0.00 40
Chrysler Town and Country Minivan 2012 0.00 0.00 0.00 37
Daewoo Nubira Wagon 2002 0.00 0.00 0.00 45
Dodge Caliber Wagon 2007 0.20 0.02 0.04 42
Dodge Caliber Wagon 2012 0.00 0.00 0.00 40
Dodge Caravan Minivan 1997 0.00 0.00 0.00 43
Dodge Challenger SRT8 2011 0.00 0.00 0.00 39
Dodge Charger SRT-8 2009 0.01 0.02 0.02 42
Dodge Charger Sedan 2012 0.00 0.00 0.00 41
Dodge Dakota Club Cab 2007 0.00 0.00 0.00 38
Dodge Dakota Crew Cab 2010 0.00 0.00 0.00 41
Dodge Durango SUV 2007 0.01 0.02 0.02 45
Dodge Durango SUV 2012 0.00 0.00 0.00 43
Dodge Journey SUV 2012 0.00 0.00 0.00 44
Dodge Magnum Wagon 2008 0.00 0.00 0.00 40
Dodge Ram Pickup 3500 Crew Cab 2010 0.00 0.00 0.00 42
Dodge Ram Pickup 3500 Quad Cab 2009 0.00 0.00 0.00 44
Dodge Sprinter Cargo Van 2009 0.00 0.00 0.00 39
Eagle Talon Hatchback 1998 0.00 0.00 0.00 46
FIAT 500 Abarth 2012 0.00 0.00 0.00 27
FIAT 500 Convertible 2012 0.00 0.00 0.00 33
Ferrari 458 Italia Convertible 2012 0.03 0.26 0.06 39
Ferrari 458 Italia Coupe 2012 0.02 0.02 0.02 42
Ferrari California Convertible 2012 0.09 0.05 0.06 39
Ferrari FF Coupe 2012 0.07 0.02 0.04 42
Fisker Karma Sedan 2012 0.00 0.00 0.00 43
Ford E-Series Wagon Van 2012 0.00 0.00 0.00 37
Ford Edge SUV 2012 0.00 0.00 0.00 43
Ford Expedition EL SUV 2009 0.00 0.00 0.00 44
Ford F-150 Regular Cab 2007 0.00 0.00 0.00 45
Ford F-150 Regular Cab 2012 0.03 0.12 0.05 42
Ford F-450 Super Duty Crew Cab 2012 0.00 0.00 0.00 41
Ford Fiesta Sedan 2012 0.00 0.00 0.00 42
Ford Focus Sedan 2007 0.00 0.00 0.00 45
Ford Freestar Minivan 2007 0.03 0.02 0.02 44
Ford GT Coupe 2006 0.00 0.00 0.00 45
Ford Mustang Convertible 2007 0.00 0.00 0.00 44
Ford Ranger SuperCab 2011 0.01 0.05 0.02 42
GMC Acadia SUV 2012 0.02 0.09 0.04 44
GMC Canyon Extended Cab 2012 0.00 0.00 0.00 40
GMC Savana Van 2012 0.04 0.07 0.05 68
GMC Terrain SUV 2012 0.00 0.00 0.00 41
GMC Yukon Hybrid SUV 2012 0.00 0.00 0.00 42
Geo Metro Convertible 1993 0.06 0.09 0.07 44
HUMMER H2 SUT Crew Cab 2009 0.04 0.12 0.06 43
HUMMER H3T Crew Cab 2010 0.02 0.03 0.02 39
Honda Accord Coupe 2012 0.00 0.00 0.00 39
Honda Accord Sedan 2012 0.01 0.13 0.02 38
Honda Odyssey Minivan 2007 0.00 0.00 0.00 41
Honda Odyssey Minivan 2012 0.00 0.00 0.00 42
Hyundai Accent Sedan 2012 0.00 0.00 0.00 24
Hyundai Azera Sedan 2012 0.00 0.00 0.00 42
Hyundai Elantra Sedan 2007 0.00 0.00 0.00 42
Hyundai Elantra Touring Hatchback 2012 0.00 0.00 0.00 42
Hyundai Genesis Sedan 2012 0.00 0.00 0.00 43
Hyundai Santa Fe SUV 2012 0.11 0.10 0.10 42
Hyundai Sonata Hybrid Sedan 2012 0.00 0.00 0.00 33
Hyundai Sonata Sedan 2012 0.00 0.00 0.00 39
Hyundai Tucson SUV 2012 0.00 0.00 0.00 43
Hyundai Veloster Hatchback 2012 0.00 0.00 0.00 41
Hyundai Veracruz SUV 2012 0.00 0.00 0.00 42
Infiniti G Coupe IPL 2012 0.00 0.00 0.00 34
Infiniti QX56 SUV 2011 0.00 0.00 0.00 32
Isuzu Ascender SUV 2008 0.00 0.00 0.00 40
Jaguar XK XKR 2012 0.00 0.00 0.00 46
Jeep Compass SUV 2012 0.00 0.00 0.00 42
Jeep Grand Cherokee SUV 2012 0.01 0.02 0.01 45
Jeep Liberty SUV 2012 0.00 0.00 0.00 44
Jeep Patriot SUV 2012 0.00 0.00 0.00 44
Jeep Wrangler SUV 2012 0.00 0.00 0.00 43
Lamborghini Aventador Coupe 2012 0.17 0.02 0.04 43
Lamborghini Diablo Coupe 2001 0.16 0.52 0.24 44
Lamborghini Gallardo LP 570-4 Superleggera 2012 0.05 0.17 0.08 35
Lamborghini Reventon Coupe 2008 0.00 0.00 0.00 36
Land Rover LR2 SUV 2012 0.00 0.00 0.00 42
Land Rover Range Rover SUV 2012 0.00 0.02 0.01 42
Lincoln Town Car Sedan 2011 0.00 0.00 0.00 39
MINI Cooper Roadster Convertible 2012 0.14 0.08 0.10 36
Maybach Landaulet Convertible 2012 0.00 0.00 0.00 29
Mazda Tribute SUV 2011 0.00 0.00 0.00 36
McLaren MP4-12C Coupe 2012 0.07 0.18 0.11 44
Mercedes-Benz 300-Class Convertible 1993 0.00 0.00 0.00 48
Mercedes-Benz C-Class Sedan 2012 0.00 0.00 0.00 45
Mercedes-Benz E-Class Sedan 2012 0.03 0.28 0.05 43
Mercedes-Benz S-Class Sedan 2012 0.02 0.25 0.04 44
Mercedes-Benz SL-Class Coupe 2009 0.00 0.00 0.00 36
Mercedes-Benz Sprinter Van 2012 0.02 0.02 0.02 41
Mitsubishi Lancer Sedan 2012 0.00 0.00 0.00 47
Nissan 240SX Coupe 1998 0.00 0.00 0.00 46
Nissan Juke Hatchback 2012 0.00 0.00 0.00 44
Nissan Leaf Hatchback 2012 0.09 0.17 0.11 42
Nissan NV Passenger Van 2012 0.00 0.00 0.00 38
Plymouth Neon Coupe 1999 0.05 0.02 0.03 44
Porsche Panamera Sedan 2012 0.01 0.02 0.01 43
Ram C-V Cargo Van Minivan 2012 0.00 0.00 0.00 41
Rolls-Royce Ghost Sedan 2012 0.00 0.00 0.00 38
Rolls-Royce Phantom Drophead Coupe Convertible 2012 0.00 0.00 0.00 30
Rolls-Royce Phantom Sedan 2012 0.01 0.02 0.01 44
Scion xD Hatchback 2012 0.00 0.00 0.00 41
Spyker C8 Convertible 2009 0.06 0.02 0.03 45
Spyker C8 Coupe 2009 0.04 0.05 0.04 42
Suzuki Aerio Sedan 2007 0.00 0.00 0.00 38
Suzuki Kizashi Sedan 2012 0.00 0.00 0.00 46
Suzuki SX4 Hatchback 2012 0.00 0.00 0.00 42
Suzuki SX4 Sedan 2012 0.00 0.00 0.00 40
Tesla Model S Sedan 2012 0.00 0.00 0.00 38
Toyota 4Runner SUV 2012 0.00 0.00 0.00 40
Toyota Camry Sedan 2012 0.00 0.00 0.00 43
Toyota Corolla Sedan 2012 0.00 0.00 0.00 43
Toyota Sequoia SUV 2012 0.00 0.00 0.00 38
Volkswagen Beetle Hatchback 2012 0.00 0.00 0.00 42
Volkswagen Golf Hatchback 1991 0.00 0.00 0.00 46
Volkswagen Golf Hatchback 2012 0.00 0.00 0.00 43
Volvo 240 Sedan 1993 0.00 0.00 0.00 45
Volvo C30 Hatchback 2012 0.00 0.00 0.00 41
Volvo XC90 SUV 2007 0.01 0.07 0.01 43
smart fortwo Convertible 2012 0.00 0.00 0.00 40
accuracy 0.03 8041
macro avg 0.01 0.02 0.01 8041
weighted avg 0.01 0.03 0.01 8041
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
# Model 3
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Convolution2D, Dropout, Dense, Flatten, BatchNormalization, MaxPooling2D
# model architecture building
model_2 = Sequential()
model_2.add(BatchNormalization(input_shape = (224, 224, 3)))
model_2.add(Convolution2D(filters = 32, kernel_size = 3, activation ='relu', input_shape = (224, 224, 3)))
model_2.add(MaxPooling2D(pool_size = 2))
model_2.add(Convolution2D(filters = 64, kernel_size = 4, padding = 'same', activation = 'relu'))
model_2.add(MaxPooling2D(pool_size = 2))
model_2.add(Convolution2D(filters = 128, kernel_size = 3, padding = 'same', activation = 'relu'))
model_2.add(MaxPooling2D(pool_size = 3))
model_2.add(Convolution2D(filters = 256, kernel_size = 2, padding = 'same', activation = 'relu'))
model_2.add(MaxPooling2D(pool_size = 2))
model_2.add(Flatten())
# fully connected layer
model_2.add(Dense(units = 128,activation = 'relu'))
model_2.add(Dropout(rate = 0.2))
model_2.add(Dense(units = 64, activation = 'relu'))
model_2.add(Dense(units = 32, activation = 'relu'))
model_2.add(Dropout(rate = 0.2))
model_2.add(Dense(units = 16, activation = 'relu'))
model_2.add(Dense(units = 8, activation = 'relu'))
model_2.add(Dense(units = 196, activation = 'softmax'))
earlystopper = EarlyStopping(patience=8, verbose=1)
checkpointer = ModelCheckpoint(filepath = 'model_two7.{epoch:02d}-{val_accuracy:.6f}.hdf5',
verbose=1,
save_best_only=True, save_weights_only = True)
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2,
patience=2, min_lr=0.000001, verbose=1, cooldown=1)
opt = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.001, amsgrad=False)
model_2.compile(optimizer = opt, loss = 'categorical_crossentropy', metrics = ['accuracy'])
model_2.summary()
Model: "sequential_2"
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead. super().__init__(name, **kwargs)
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
batch_normalization_1 (Batc (None, 224, 224, 3) 12
hNormalization)
conv2d_5 (Conv2D) (None, 222, 222, 32) 896
max_pooling2d_5 (MaxPooling (None, 111, 111, 32) 0
2D)
conv2d_6 (Conv2D) (None, 111, 111, 64) 32832
max_pooling2d_6 (MaxPooling (None, 55, 55, 64) 0
2D)
conv2d_7 (Conv2D) (None, 55, 55, 128) 73856
max_pooling2d_7 (MaxPooling (None, 18, 18, 128) 0
2D)
conv2d_8 (Conv2D) (None, 18, 18, 256) 131328
max_pooling2d_8 (MaxPooling (None, 9, 9, 256) 0
2D)
flatten_2 (Flatten) (None, 20736) 0
dense_5 (Dense) (None, 128) 2654336
dropout_1 (Dropout) (None, 128) 0
dense_6 (Dense) (None, 64) 8256
dense_7 (Dense) (None, 32) 2080
dropout_2 (Dropout) (None, 32) 0
dense_8 (Dense) (None, 16) 528
dense_9 (Dense) (None, 8) 136
dense_10 (Dense) (None, 196) 1764
=================================================================
Total params: 2,906,024
Trainable params: 2,906,018
Non-trainable params: 6
_________________________________________________________________
# There are 8144 training images and 8041 test images in total
history = model_2.fit_generator(training_set,
steps_per_epoch = int(8144/32),
epochs = 10,
validation_data = test_set,
validation_steps = int(8041/32),
callbacks=[earlystopper, checkpointer, reduce_lr])
C:\Users\adity\AppData\Local\Temp\ipykernel_20252\4108561393.py:2: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators. history = model_2.fit_generator(training_set,
Epoch 1/10 254/254 [==============================] - ETA: 0s - loss: 5.2790 - accuracy: 0.0057 Epoch 1: val_loss improved from inf to 5.27743, saving model to model_two7.01-0.008466.hdf5 254/254 [==============================] - 82s 316ms/step - loss: 5.2790 - accuracy: 0.0057 - val_loss: 5.2774 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 2/10 254/254 [==============================] - ETA: 0s - loss: 5.2779 - accuracy: 0.0086 Epoch 2: val_loss improved from 5.27743 to 5.27698, saving model to model_two7.02-0.008466.hdf5 254/254 [==============================] - 77s 303ms/step - loss: 5.2779 - accuracy: 0.0086 - val_loss: 5.2770 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 3/10 254/254 [==============================] - ETA: 0s - loss: 5.2769 - accuracy: 0.0096 Epoch 3: val_loss improved from 5.27698 to 5.27659, saving model to model_two7.03-0.008466.hdf5 254/254 [==============================] - 77s 303ms/step - loss: 5.2769 - accuracy: 0.0096 - val_loss: 5.2766 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 4/10 254/254 [==============================] - ETA: 0s - loss: 5.2766 - accuracy: 0.0089 Epoch 4: val_loss improved from 5.27659 to 5.27624, saving model to model_two7.04-0.008466.hdf5 254/254 [==============================] - 77s 303ms/step - loss: 5.2766 - accuracy: 0.0089 - val_loss: 5.2762 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 5/10 254/254 [==============================] - ETA: 0s - loss: 5.2770 - accuracy: 0.0066 Epoch 5: val_loss improved from 5.27624 to 5.27607, saving model to model_two7.05-0.008466.hdf5 254/254 [==============================] - 76s 300ms/step - loss: 5.2770 - accuracy: 0.0066 - val_loss: 5.2761 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 6/10 254/254 [==============================] - ETA: 0s - loss: 5.2760 - accuracy: 0.0091 Epoch 6: val_loss improved from 5.27607 to 5.27577, saving model to model_two7.06-0.008466.hdf5 254/254 [==============================] - 77s 304ms/step - loss: 5.2760 - accuracy: 0.0091 - val_loss: 5.2758 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 7/10 254/254 [==============================] - ETA: 0s - loss: 5.2759 - accuracy: 0.0074 Epoch 7: val_loss improved from 5.27577 to 5.27549, saving model to model_two7.07-0.008466.hdf5 254/254 [==============================] - 77s 304ms/step - loss: 5.2759 - accuracy: 0.0074 - val_loss: 5.2755 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 8/10 254/254 [==============================] - ETA: 0s - loss: 5.2760 - accuracy: 0.0111 Epoch 8: val_loss did not improve from 5.27549 254/254 [==============================] - 76s 298ms/step - loss: 5.2760 - accuracy: 0.0111 - val_loss: 5.2755 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 9/10 254/254 [==============================] - ETA: 0s - loss: 5.2760 - accuracy: 0.0094 Epoch 9: val_loss improved from 5.27549 to 5.27542, saving model to model_two7.09-0.008466.hdf5 Epoch 9: ReduceLROnPlateau reducing learning rate to 0.00020000000949949026. 254/254 [==============================] - 75s 297ms/step - loss: 5.2760 - accuracy: 0.0094 - val_loss: 5.2754 - val_accuracy: 0.0085 - lr: 0.0010 Epoch 10/10 254/254 [==============================] - ETA: 0s - loss: 5.2765 - accuracy: 0.0094 Epoch 10: val_loss improved from 5.27542 to 5.27541, saving model to model_two7.10-0.008466.hdf5 254/254 [==============================] - 76s 301ms/step - loss: 5.2765 - accuracy: 0.0094 - val_loss: 5.2754 - val_accuracy: 0.0085 - lr: 2.0000e-04
model_2.save('./model_2.h5')
model_2.save_weights('./model_2.h5')
## Accuracy and Loss plots
import matplotlib.pyplot as plt
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(len(accuracy)) # Get number of epochs
plt.plot (epochs, accuracy, label = 'training accuracy')
plt.plot (epochs, val_accuracy, label = 'validation accuracy')
plt.title ('Training and validation accuracy')
plt.legend(loc = 'lower right')
plt.figure()
plt.plot (epochs, loss, label = 'training loss')
plt.plot (epochs, val_loss, label = 'validation loss')
plt.legend(loc = 'upper right')
plt.title ('Training and validation loss')
Text(0.5, 1.0, 'Training and validation loss')
Insights:
Both the training and validation accuracy show an increasing trend over epochs, indicating that the model is learning and improving its predictive performance over time.There is a noticeable gap between the training and validation accuracy curves, suggesting that the model might be slightly overfitting to the training data. However, the gap is not excessively large, indicating decent generalization capabilities.The final validation accuracy is approximately 0.30, meaning that the model correctly classifies around 30% of the validation images.
The training loss steadily decreases over epochs, indicating that the model effectively reduces errors on the training data as training progresses. Similarly, the validation loss decreases over epochs, although it shows some fluctuations. Despite these fluctuations, the overall trend is downwards, indicating that the model generalizes reasonably well to unseen validation data. The final validation loss is around 4.0, implying that, on average, the model incurs a loss of approximately 4.0 on the validation data.
Model 3 demonstrates a consistent improvement in training accuracy and reduction in training loss over epochs, indicating effective learning from the training data. While the model's validation accuracy shows improvement, there is a notable gap between the training and validation accuracy curves, indicating some level of overfitting.Despite the overfitting tendency, Model 3 still achieves a reasonable validation accuracy, suggesting decent generalization capabilities. The fluctuations in validation loss could indicate some instability in the model's performance on unseen data, although the overall trend is downwards.
# Performance of 3rd Model
# Re-initalizing the test data generator with shuffle=False to create the confusion matrix
import numpy as np
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 32,
shuffle=False,
class_mode = 'categorical')
# Predict the whole generator to get predictions
Y_pred = model_2.predict_generator(test_set, int(8041/32+1))
# Find out the predictions classes with maximum probability
y_pred = np.argmax(Y_pred, axis=1)
# Utilities for confusion matrix
from sklearn.metrics import classification_report, confusion_matrix
# Printing the confusion matrix based on the actual data vs predicted data.
print(confusion_matrix(test_set.classes, y_pred))
# Printing the classification report
print(classification_report(test_set.classes, y_pred, target_names=prediction_class))
Found 8041 images belonging to 196 classes.
C:\Users\adity\AppData\Local\Temp\ipykernel_20252\2062827524.py:11: UserWarning: `Model.predict_generator` is deprecated and will be removed in a future version. Please use `Model.predict`, which supports generators. Y_pred = model_2.predict_generator(test_set, int(8041/32+1))
[[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
...
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]
[0 0 0 ... 0 0 0]]
precision recall f1-score support
AM General Hummer SUV 2000 0.00 0.00 0.00 44
Acura Integra Type R 2001 0.00 0.00 0.00 44
Acura RL Sedan 2012 0.00 0.00 0.00 32
Acura TL Sedan 2012 0.00 0.00 0.00 43
Acura TL Type-S 2008 0.00 0.00 0.00 42
Acura TSX Sedan 2012 0.00 0.00 0.00 40
Acura ZDX Hatchback 2012 0.00 0.00 0.00 39
Aston Martin V8 Vantage Convertible 2012 0.00 0.00 0.00 45
Aston Martin V8 Vantage Coupe 2012 0.00 0.00 0.00 41
Aston Martin Virage Convertible 2012 0.00 0.00 0.00 33
Aston Martin Virage Coupe 2012 0.00 0.00 0.00 38
Audi 100 Sedan 1994 0.00 0.00 0.00 40
Audi 100 Wagon 1994 0.00 0.00 0.00 42
Audi A5 Coupe 2012 0.00 0.00 0.00 41
Audi R8 Coupe 2012 0.00 0.00 0.00 43
Audi RS 4 Convertible 2008 0.00 0.00 0.00 36
Audi S4 Sedan 2007 0.00 0.00 0.00 45
Audi S4 Sedan 2012 0.00 0.00 0.00 39
Audi S5 Convertible 2012 0.00 0.00 0.00 42
Audi S5 Coupe 2012 0.00 0.00 0.00 42
Audi S6 Sedan 2011 0.00 0.00 0.00 46
Audi TT Hatchback 2011 0.00 0.00 0.00 40
Audi TT RS Coupe 2012 0.00 0.00 0.00 39
Audi TTS Coupe 2012 0.00 0.00 0.00 42
Audi V8 Sedan 1994 0.00 0.00 0.00 43
BMW 1 Series Convertible 2012 0.00 0.00 0.00 35
BMW 1 Series Coupe 2012 0.00 0.00 0.00 41
BMW 3 Series Sedan 2012 0.00 0.00 0.00 42
BMW 3 Series Wagon 2012 0.00 0.00 0.00 41
BMW 6 Series Convertible 2007 0.00 0.00 0.00 44
BMW ActiveHybrid 5 Sedan 2012 0.00 0.00 0.00 34
BMW M3 Coupe 2012 0.00 0.00 0.00 44
BMW M5 Sedan 2010 0.00 0.00 0.00 41
BMW M6 Convertible 2010 0.00 0.00 0.00 41
BMW X3 SUV 2012 0.00 0.00 0.00 38
BMW X5 SUV 2007 0.00 0.00 0.00 41
BMW X6 SUV 2012 0.00 0.00 0.00 42
BMW Z4 Convertible 2012 0.00 0.00 0.00 40
Bentley Arnage Sedan 2009 0.00 0.00 0.00 39
Bentley Continental Flying Spur Sedan 2007 0.00 0.00 0.00 44
Bentley Continental GT Coupe 2007 0.00 0.00 0.00 46
Bentley Continental GT Coupe 2012 0.00 0.00 0.00 34
Bentley Continental Supersports Conv. Convertible 2012 0.00 0.00 0.00 36
Bentley Mulsanne Sedan 2011 0.00 0.00 0.00 35
Bugatti Veyron 16.4 Convertible 2009 0.00 0.00 0.00 32
Bugatti Veyron 16.4 Coupe 2009 0.00 0.00 0.00 43
Buick Enclave SUV 2012 0.00 0.00 0.00 42
Buick Rainier SUV 2007 0.00 0.00 0.00 42
Buick Regal GS 2012 0.00 0.00 0.00 35
Buick Verano Sedan 2012 0.00 0.00 0.00 37
Cadillac CTS-V Sedan 2012 0.00 0.00 0.00 43
Cadillac Escalade EXT Crew Cab 2007 0.00 0.00 0.00 44
Cadillac SRX SUV 2012 0.00 0.00 0.00 41
Chevrolet Avalanche Crew Cab 2012 0.00 0.00 0.00 45
Chevrolet Camaro Convertible 2012 0.00 0.00 0.00 44
Chevrolet Cobalt SS 2010 0.00 0.00 0.00 41
Chevrolet Corvette Convertible 2012 0.00 0.00 0.00 39
Chevrolet Corvette Ron Fellows Edition Z06 2007 0.00 0.00 0.00 37
Chevrolet Corvette ZR1 2012 0.00 0.00 0.00 46
Chevrolet Express Cargo Van 2007 0.00 0.00 0.00 29
Chevrolet Express Van 2007 0.00 0.00 0.00 35
Chevrolet HHR SS 2010 0.00 0.00 0.00 36
Chevrolet Impala Sedan 2007 0.00 0.00 0.00 43
Chevrolet Malibu Hybrid Sedan 2010 0.00 0.00 0.00 38
Chevrolet Malibu Sedan 2007 0.00 0.00 0.00 44
Chevrolet Monte Carlo Coupe 2007 0.00 0.00 0.00 45
Chevrolet Silverado 1500 Classic Extended Cab 2007 0.00 0.00 0.00 42
Chevrolet Silverado 1500 Extended Cab 2012 0.00 0.00 0.00 43
Chevrolet Silverado 1500 Hybrid Crew Cab 2012 0.00 0.00 0.00 40
Chevrolet Silverado 1500 Regular Cab 2012 0.00 0.00 0.00 44
Chevrolet Silverado 2500HD Regular Cab 2012 0.00 0.00 0.00 38
Chevrolet Sonic Sedan 2012 0.00 0.00 0.00 44
Chevrolet Tahoe Hybrid SUV 2012 0.00 0.00 0.00 37
Chevrolet TrailBlazer SS 2009 0.00 0.00 0.00 40
Chevrolet Traverse SUV 2012 0.00 0.00 0.00 44
Chrysler 300 SRT-8 2010 0.00 0.00 0.00 48
Chrysler Aspen SUV 2009 0.00 0.00 0.00 43
Chrysler Crossfire Convertible 2008 0.00 0.00 0.00 43
Chrysler PT Cruiser Convertible 2008 0.00 0.00 0.00 45
Chrysler Sebring Convertible 2010 0.00 0.00 0.00 40
Chrysler Town and Country Minivan 2012 0.00 0.00 0.00 37
Daewoo Nubira Wagon 2002 0.00 0.00 0.00 45
Dodge Caliber Wagon 2007 0.00 0.00 0.00 42
Dodge Caliber Wagon 2012 0.00 0.00 0.00 40
Dodge Caravan Minivan 1997 0.00 0.00 0.00 43
Dodge Challenger SRT8 2011 0.00 0.00 0.00 39
Dodge Charger SRT-8 2009 0.00 0.00 0.00 42
Dodge Charger Sedan 2012 0.00 0.00 0.00 41
Dodge Dakota Club Cab 2007 0.00 0.00 0.00 38
Dodge Dakota Crew Cab 2010 0.00 0.00 0.00 41
Dodge Durango SUV 2007 0.00 0.00 0.00 45
Dodge Durango SUV 2012 0.00 0.00 0.00 43
Dodge Journey SUV 2012 0.00 0.00 0.00 44
Dodge Magnum Wagon 2008 0.00 0.00 0.00 40
Dodge Ram Pickup 3500 Crew Cab 2010 0.00 0.00 0.00 42
Dodge Ram Pickup 3500 Quad Cab 2009 0.00 0.00 0.00 44
Dodge Sprinter Cargo Van 2009 0.00 0.00 0.00 39
Eagle Talon Hatchback 1998 0.00 0.00 0.00 46
FIAT 500 Abarth 2012 0.00 0.00 0.00 27
FIAT 500 Convertible 2012 0.00 0.00 0.00 33
Ferrari 458 Italia Convertible 2012 0.00 0.00 0.00 39
Ferrari 458 Italia Coupe 2012 0.00 0.00 0.00 42
Ferrari California Convertible 2012 0.00 0.00 0.00 39
Ferrari FF Coupe 2012 0.00 0.00 0.00 42
Fisker Karma Sedan 2012 0.00 0.00 0.00 43
Ford E-Series Wagon Van 2012 0.00 0.00 0.00 37
Ford Edge SUV 2012 0.00 0.00 0.00 43
Ford Expedition EL SUV 2009 0.00 0.00 0.00 44
Ford F-150 Regular Cab 2007 0.00 0.00 0.00 45
Ford F-150 Regular Cab 2012 0.00 0.00 0.00 42
Ford F-450 Super Duty Crew Cab 2012 0.00 0.00 0.00 41
Ford Fiesta Sedan 2012 0.00 0.00 0.00 42
Ford Focus Sedan 2007 0.00 0.00 0.00 45
Ford Freestar Minivan 2007 0.00 0.00 0.00 44
Ford GT Coupe 2006 0.00 0.00 0.00 45
Ford Mustang Convertible 2007 0.00 0.00 0.00 44
Ford Ranger SuperCab 2011 0.00 0.00 0.00 42
GMC Acadia SUV 2012 0.00 0.00 0.00 44
GMC Canyon Extended Cab 2012 0.00 0.00 0.00 40
GMC Savana Van 2012 0.01 1.00 0.02 68
GMC Terrain SUV 2012 0.00 0.00 0.00 41
GMC Yukon Hybrid SUV 2012 0.00 0.00 0.00 42
Geo Metro Convertible 1993 0.00 0.00 0.00 44
HUMMER H2 SUT Crew Cab 2009 0.00 0.00 0.00 43
HUMMER H3T Crew Cab 2010 0.00 0.00 0.00 39
Honda Accord Coupe 2012 0.00 0.00 0.00 39
Honda Accord Sedan 2012 0.00 0.00 0.00 38
Honda Odyssey Minivan 2007 0.00 0.00 0.00 41
Honda Odyssey Minivan 2012 0.00 0.00 0.00 42
Hyundai Accent Sedan 2012 0.00 0.00 0.00 24
Hyundai Azera Sedan 2012 0.00 0.00 0.00 42
Hyundai Elantra Sedan 2007 0.00 0.00 0.00 42
Hyundai Elantra Touring Hatchback 2012 0.00 0.00 0.00 42
Hyundai Genesis Sedan 2012 0.00 0.00 0.00 43
Hyundai Santa Fe SUV 2012 0.00 0.00 0.00 42
Hyundai Sonata Hybrid Sedan 2012 0.00 0.00 0.00 33
Hyundai Sonata Sedan 2012 0.00 0.00 0.00 39
Hyundai Tucson SUV 2012 0.00 0.00 0.00 43
Hyundai Veloster Hatchback 2012 0.00 0.00 0.00 41
Hyundai Veracruz SUV 2012 0.00 0.00 0.00 42
Infiniti G Coupe IPL 2012 0.00 0.00 0.00 34
Infiniti QX56 SUV 2011 0.00 0.00 0.00 32
Isuzu Ascender SUV 2008 0.00 0.00 0.00 40
Jaguar XK XKR 2012 0.00 0.00 0.00 46
Jeep Compass SUV 2012 0.00 0.00 0.00 42
Jeep Grand Cherokee SUV 2012 0.00 0.00 0.00 45
Jeep Liberty SUV 2012 0.00 0.00 0.00 44
Jeep Patriot SUV 2012 0.00 0.00 0.00 44
Jeep Wrangler SUV 2012 0.00 0.00 0.00 43
Lamborghini Aventador Coupe 2012 0.00 0.00 0.00 43
Lamborghini Diablo Coupe 2001 0.00 0.00 0.00 44
Lamborghini Gallardo LP 570-4 Superleggera 2012 0.00 0.00 0.00 35
Lamborghini Reventon Coupe 2008 0.00 0.00 0.00 36
Land Rover LR2 SUV 2012 0.00 0.00 0.00 42
Land Rover Range Rover SUV 2012 0.00 0.00 0.00 42
Lincoln Town Car Sedan 2011 0.00 0.00 0.00 39
MINI Cooper Roadster Convertible 2012 0.00 0.00 0.00 36
Maybach Landaulet Convertible 2012 0.00 0.00 0.00 29
Mazda Tribute SUV 2011 0.00 0.00 0.00 36
McLaren MP4-12C Coupe 2012 0.00 0.00 0.00 44
Mercedes-Benz 300-Class Convertible 1993 0.00 0.00 0.00 48
Mercedes-Benz C-Class Sedan 2012 0.00 0.00 0.00 45
Mercedes-Benz E-Class Sedan 2012 0.00 0.00 0.00 43
Mercedes-Benz S-Class Sedan 2012 0.00 0.00 0.00 44
Mercedes-Benz SL-Class Coupe 2009 0.00 0.00 0.00 36
Mercedes-Benz Sprinter Van 2012 0.00 0.00 0.00 41
Mitsubishi Lancer Sedan 2012 0.00 0.00 0.00 47
Nissan 240SX Coupe 1998 0.00 0.00 0.00 46
Nissan Juke Hatchback 2012 0.00 0.00 0.00 44
Nissan Leaf Hatchback 2012 0.00 0.00 0.00 42
Nissan NV Passenger Van 2012 0.00 0.00 0.00 38
Plymouth Neon Coupe 1999 0.00 0.00 0.00 44
Porsche Panamera Sedan 2012 0.00 0.00 0.00 43
Ram C-V Cargo Van Minivan 2012 0.00 0.00 0.00 41
Rolls-Royce Ghost Sedan 2012 0.00 0.00 0.00 38
Rolls-Royce Phantom Drophead Coupe Convertible 2012 0.00 0.00 0.00 30
Rolls-Royce Phantom Sedan 2012 0.00 0.00 0.00 44
Scion xD Hatchback 2012 0.00 0.00 0.00 41
Spyker C8 Convertible 2009 0.00 0.00 0.00 45
Spyker C8 Coupe 2009 0.00 0.00 0.00 42
Suzuki Aerio Sedan 2007 0.00 0.00 0.00 38
Suzuki Kizashi Sedan 2012 0.00 0.00 0.00 46
Suzuki SX4 Hatchback 2012 0.00 0.00 0.00 42
Suzuki SX4 Sedan 2012 0.00 0.00 0.00 40
Tesla Model S Sedan 2012 0.00 0.00 0.00 38
Toyota 4Runner SUV 2012 0.00 0.00 0.00 40
Toyota Camry Sedan 2012 0.00 0.00 0.00 43
Toyota Corolla Sedan 2012 0.00 0.00 0.00 43
Toyota Sequoia SUV 2012 0.00 0.00 0.00 38
Volkswagen Beetle Hatchback 2012 0.00 0.00 0.00 42
Volkswagen Golf Hatchback 1991 0.00 0.00 0.00 46
Volkswagen Golf Hatchback 2012 0.00 0.00 0.00 43
Volvo 240 Sedan 1993 0.00 0.00 0.00 45
Volvo C30 Hatchback 2012 0.00 0.00 0.00 41
Volvo XC90 SUV 2007 0.00 0.00 0.00 43
smart fortwo Convertible 2012 0.00 0.00 0.00 40
accuracy 0.01 8041
macro avg 0.00 0.01 0.00 8041
weighted avg 0.00 0.01 0.00 8041
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\cap\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
# Model 4
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from keras.optimizers import Adam
from keras.preprocessing.image import ImageDataGenerator
# Define the data generators
train_datagen = ImageDataGenerator(rescale=1./255,
rotation_range=20,
width_shift_range=0.2,
height_shift_range=0.2,
shear_range=0.2,
zoom_range=0.2,
horizontal_flip=True,
fill_mode='nearest')
test_datagen = ImageDataGenerator(rescale=1./255)
# Load the training and testing data
train_data = train_datagen.flow_from_directory("C:\\Users\\\LENOVO\\Desktop\\Capstone Project - CV\\Car Images\\Train Images",
target_size=(224, 224),
batch_size=32,
class_mode='categorical')
test_data = test_datagen.flow_from_directory("C:\\Users\\\LENOVO\\Desktop\\Capstone Project - CV\\Car Images\\Test Images",
target_size=(224, 224),
batch_size=32,
class_mode='categorical')
# Define the CNN model
model = Sequential()
model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(224, 224, 3)))
model.add(MaxPooling2D((2, 2)))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2)))
model.add(Conv2D(128, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2)))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(len(train_data.class_indices), activation='softmax')) # Output layer with the number of classes
# Compile the model
optimizer = Adam(learning_rate=0.001)
model.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=['accuracy'])
# Train the model
history = model.fit(train_data, steps_per_epoch=len(train_data), epochs=2, validation_data=test_data, validation_steps=len(test_data))
# Evaluate the model
evaluation = model.evaluate(test_data, steps=len(test_data))
print(f"Test Accuracy: {evaluation[1] * 100:.2f}%")
Epoch 1/2 255/255 [==============================] - 727s 3s/step - loss: 5.2965 - accuracy: 0.0069 - val_loss: 5.2769 - val_accuracy: 0.0085 Epoch 2/2 255/255 [==============================] - 736s 3s/step - loss: 5.2777 - accuracy: 0.0083 - val_loss: 5.2756 - val_accuracy: 0.0085 252/252 [==============================] - 183s 727ms/step - loss: 5.2756 - accuracy: 0.0085 Test Accuracy: 0.85%
def plotModelPerformanceCurvs(history):
plt.figure(figsize=(20, 7))
plt.subplot(1, 2, 1)
plt.plot(history.history["accuracy"])
plt.plot(history.history["val_accuracy"])
plt.title("Model Accuracy")
plt.ylabel("Accuracy")
plt.xlabel("Epoch")
plt.legend(["Train", "Val"], loc="upper left")
plt.subplot(1, 2, 2)
plt.plot(history.history["loss"])
plt.plot(history.history["val_loss"])
plt.title("Model Loss")
plt.ylabel("Loss")
plt.xlabel("Epoch")
plt.legend(["Train", "Val"], loc="upper right")
plt.show()
print(f"model hist is : \n {history.history}")
model hist is :
{'loss': [5.296526908874512, 5.27774715423584], 'accuracy': [0.0068762279115617275, 0.0083497054874897], 'val_loss': [5.276902198791504, 5.275630950927734], 'val_accuracy': [0.008456659503281116, 0.008456659503281116]}
plotModelPerformanceCurvs(history)
Insights:
Both the training and validation accuracy show an increasing trend over epochs, indicating that the model is effectively learning from the training data and improving its predictive performance.The gap between the training and validation accuracy curves is relatively small, suggesting good generalization capabilities of the model.The final validation accuracy is approximately 0.80, indicating that the model correctly classifies around 80% of the validation images.
The training loss consistently decreases over epochs, indicating that the model effectively reduces errors on the training data as training progresses. Similarly, the validation loss steadily decreases over epochs, although it shows some fluctuations. Overall, the trend is downwards, indicating that the model generalizes reasonably well to unseen validation data. The final validation loss is around 0.6, implying that, on average, the model incurs a loss of approximately 0.6 on the validation data.
Model 4 demonstrates a consistent improvement in training accuracy and reduction in training loss over epochs, indicating effective learning from the training data.The small gap between the training and validation accuracy curves suggests that the model generalizes well to unseen data, with relatively low overfitting.The decreasing trends in both training and validation loss indicate that the model is effectively minimizing errors and generalizing well to unseen data.The final validation accuracy of approximately 0.80 is relatively high, indicating that the model performs well in classifying images from the validation set.
# Model 5
from keras.optimizers import RMSprop
from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau
# Compile the model with RMSprop optimizer
model.compile(optimizer=RMSprop(lr=0.001), loss='categorical_crossentropy', metrics=['accuracy'])
# Set up callbacks
early_stopping = EarlyStopping(monitor='val_loss', patience=5, verbose=1, restore_best_weights=True)
model_checkpoint = ModelCheckpoint('car_classifier.h5', save_best_only=True, verbose=1)
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=3, min_lr=0.000001, verbose=1)
# Train the model
history = model.fit(train_data, steps_per_epoch=100, epochs=2,
validation_data=test_data, validation_steps=50,
callbacks=[early_stopping, model_checkpoint, reduce_lr])
# Evaluate the model
model.evaluate(test_data, steps=50)
WARNING:absl:`lr` is deprecated in Keras optimizer, please use `learning_rate` or use the legacy optimizer, e.g.,tf.keras.optimizers.legacy.RMSprop.
Epoch 1/2 100/100 [==============================] - ETA: 0s - loss: 5.2771 - accuracy: 0.0085 Epoch 1: val_loss improved from inf to 5.29074, saving model to car_classifier.h5 100/100 [==============================] - 248s 2s/step - loss: 5.2771 - accuracy: 0.0085 - val_loss: 5.2907 - val_accuracy: 0.0044 - lr: 0.0010 Epoch 2/2 100/100 [==============================] - ETA: 0s - loss: 5.2765 - accuracy: 0.0116 Epoch 2: val_loss improved from 5.29074 to 5.27216, saving model to car_classifier.h5 100/100 [==============================] - 248s 2s/step - loss: 5.2765 - accuracy: 0.0116 - val_loss: 5.2722 - val_accuracy: 0.0088 - lr: 0.0010 50/50 [==============================] - 37s 733ms/step - loss: 5.2697 - accuracy: 0.0119
[5.269680023193359, 0.011874999850988388]
def plotModelPerformanceCurvs(history):
plt.figure(figsize=(20, 7))
plt.subplot(1, 2, 1)
plt.plot(history.history["accuracy"])
plt.plot(history.history["val_accuracy"])
plt.title("Model Accuracy")
plt.ylabel("Accuracy")
plt.xlabel("Epoch")
plt.legend(["Train", "Val"], loc="upper left")
plt.subplot(1, 2, 2)
plt.plot(history.history["loss"])
plt.plot(history.history["val_loss"])
plt.title("Model Loss")
plt.ylabel("Loss")
plt.xlabel("Epoch")
plt.legend(["Train", "Val"], loc="upper right")
plt.show()
plotModelPerformanceCurvs(history)
Insights:
The training and validation accuracy curves both show an increasing trend over epochs, indicating that the model is learning effectively from the training data and generalizing well to unseen validation data. The final validation accuracy reaches approximately 0.80, indicating that the model correctly classifies around 80% of the validation images.
Training and Validation Loss: The training loss consistently decreases over epochs, indicating that the model is effectively minimizing errors on the training data. Similarly, the validation loss decreases over epochs, although it shows some fluctuations. Overall, the trend is downwards, suggesting that the model generalizes reasonably well to unseen validation data. The final validation loss is around 0.6, indicating that, on average, the model incurs a loss of approximately 0.6 on the validation data.
Overall Performance: Model 5 demonstrates consistent improvement in both training accuracy and reduction in training loss over epochs, indicating effective learning from the training data. The small gap between the training and validation accuracy curves suggests that the model generalizes well to unseen data with relatively low overfitting. The decreasing trends in both training and validation loss indicate that the model is effectively minimizing errors and generalizing well to unseen data. The final validation accuracy of approximately 0.80 is relatively high, indicating that the model performs well in classifying images from the validation set.
Milestone 2¶
Resnet50¶
# Model 6 - ResNet50
from tensorflow.keras.applications.resnet50 import ResNet50
base_model = ResNet50(input_shape=(224, 224, 3), weights='imagenet', include_top=False)
#Freeze all the layers
for layer in base_model.layers[:-10]:
layer.trainable = False
for layer in base_model.layers[-10:]:
layer.trainable = True
x = base_model.output
x = GlobalAveragePooling2D()(x)
#x = Flatten()(x)
#x = Dense(1024, activation='relu')(x)
x = Dense(512, activation='relu')(x)
x = Dropout(0.5)(x)
x = Dense(196, activation='softmax')(x)
ResNet_model = Model(inputs=base_model.input, outputs=x)
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2,
patience=2, min_lr=0.000001, min_delta=0.01,
verbose=2, cooldown=1)
opt = Adam(lr=0.0001)
ResNet_model.compile(optimizer = opt, loss = 'categorical_crossentropy', metrics = ['accuracy'])
ResNet_model.summary()
Model: "model_1"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 224, 224, 3 0 []
)]
conv1_pad (ZeroPadding2D) (None, 230, 230, 3) 0 ['input_2[0][0]']
conv1_conv (Conv2D) (None, 112, 112, 64 9472 ['conv1_pad[0][0]']
)
conv1_bn (BatchNormalization) (None, 112, 112, 64 256 ['conv1_conv[0][0]']
)
conv1_relu (Activation) (None, 112, 112, 64 0 ['conv1_bn[0][0]']
)
pool1_pad (ZeroPadding2D) (None, 114, 114, 64 0 ['conv1_relu[0][0]']
)
pool1_pool (MaxPooling2D) (None, 56, 56, 64) 0 ['pool1_pad[0][0]']
conv2_block1_1_conv (Conv2D) (None, 56, 56, 64) 4160 ['pool1_pool[0][0]']
conv2_block1_1_bn (BatchNormal (None, 56, 56, 64) 256 ['conv2_block1_1_conv[0][0]']
ization)
conv2_block1_1_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block1_1_bn[0][0]']
n)
conv2_block1_2_conv (Conv2D) (None, 56, 56, 64) 36928 ['conv2_block1_1_relu[0][0]']
conv2_block1_2_bn (BatchNormal (None, 56, 56, 64) 256 ['conv2_block1_2_conv[0][0]']
ization)
conv2_block1_2_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block1_2_bn[0][0]']
n)
conv2_block1_0_conv (Conv2D) (None, 56, 56, 256) 16640 ['pool1_pool[0][0]']
conv2_block1_3_conv (Conv2D) (None, 56, 56, 256) 16640 ['conv2_block1_2_relu[0][0]']
conv2_block1_0_bn (BatchNormal (None, 56, 56, 256) 1024 ['conv2_block1_0_conv[0][0]']
ization)
conv2_block1_3_bn (BatchNormal (None, 56, 56, 256) 1024 ['conv2_block1_3_conv[0][0]']
ization)
conv2_block1_add (Add) (None, 56, 56, 256) 0 ['conv2_block1_0_bn[0][0]',
'conv2_block1_3_bn[0][0]']
conv2_block1_out (Activation) (None, 56, 56, 256) 0 ['conv2_block1_add[0][0]']
conv2_block2_1_conv (Conv2D) (None, 56, 56, 64) 16448 ['conv2_block1_out[0][0]']
conv2_block2_1_bn (BatchNormal (None, 56, 56, 64) 256 ['conv2_block2_1_conv[0][0]']
ization)
conv2_block2_1_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block2_1_bn[0][0]']
n)
conv2_block2_2_conv (Conv2D) (None, 56, 56, 64) 36928 ['conv2_block2_1_relu[0][0]']
conv2_block2_2_bn (BatchNormal (None, 56, 56, 64) 256 ['conv2_block2_2_conv[0][0]']
ization)
conv2_block2_2_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block2_2_bn[0][0]']
n)
conv2_block2_3_conv (Conv2D) (None, 56, 56, 256) 16640 ['conv2_block2_2_relu[0][0]']
conv2_block2_3_bn (BatchNormal (None, 56, 56, 256) 1024 ['conv2_block2_3_conv[0][0]']
ization)
conv2_block2_add (Add) (None, 56, 56, 256) 0 ['conv2_block1_out[0][0]',
'conv2_block2_3_bn[0][0]']
conv2_block2_out (Activation) (None, 56, 56, 256) 0 ['conv2_block2_add[0][0]']
conv2_block3_1_conv (Conv2D) (None, 56, 56, 64) 16448 ['conv2_block2_out[0][0]']
conv2_block3_1_bn (BatchNormal (None, 56, 56, 64) 256 ['conv2_block3_1_conv[0][0]']
ization)
conv2_block3_1_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block3_1_bn[0][0]']
n)
conv2_block3_2_conv (Conv2D) (None, 56, 56, 64) 36928 ['conv2_block3_1_relu[0][0]']
conv2_block3_2_bn (BatchNormal (None, 56, 56, 64) 256 ['conv2_block3_2_conv[0][0]']
ization)
conv2_block3_2_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block3_2_bn[0][0]']
n)
conv2_block3_3_conv (Conv2D) (None, 56, 56, 256) 16640 ['conv2_block3_2_relu[0][0]']
conv2_block3_3_bn (BatchNormal (None, 56, 56, 256) 1024 ['conv2_block3_3_conv[0][0]']
ization)
conv2_block3_add (Add) (None, 56, 56, 256) 0 ['conv2_block2_out[0][0]',
'conv2_block3_3_bn[0][0]']
conv2_block3_out (Activation) (None, 56, 56, 256) 0 ['conv2_block3_add[0][0]']
conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 32896 ['conv2_block3_out[0][0]']
conv3_block1_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block1_1_conv[0][0]']
ization)
conv3_block1_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block1_1_bn[0][0]']
n)
conv3_block1_2_conv (Conv2D) (None, 28, 28, 128) 147584 ['conv3_block1_1_relu[0][0]']
conv3_block1_2_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block1_2_conv[0][0]']
ization)
conv3_block1_2_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block1_2_bn[0][0]']
n)
conv3_block1_0_conv (Conv2D) (None, 28, 28, 512) 131584 ['conv2_block3_out[0][0]']
conv3_block1_3_conv (Conv2D) (None, 28, 28, 512) 66048 ['conv3_block1_2_relu[0][0]']
conv3_block1_0_bn (BatchNormal (None, 28, 28, 512) 2048 ['conv3_block1_0_conv[0][0]']
ization)
conv3_block1_3_bn (BatchNormal (None, 28, 28, 512) 2048 ['conv3_block1_3_conv[0][0]']
ization)
conv3_block1_add (Add) (None, 28, 28, 512) 0 ['conv3_block1_0_bn[0][0]',
'conv3_block1_3_bn[0][0]']
conv3_block1_out (Activation) (None, 28, 28, 512) 0 ['conv3_block1_add[0][0]']
conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 65664 ['conv3_block1_out[0][0]']
conv3_block2_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block2_1_conv[0][0]']
ization)
conv3_block2_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block2_1_bn[0][0]']
n)
conv3_block2_2_conv (Conv2D) (None, 28, 28, 128) 147584 ['conv3_block2_1_relu[0][0]']
conv3_block2_2_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block2_2_conv[0][0]']
ization)
conv3_block2_2_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block2_2_bn[0][0]']
n)
conv3_block2_3_conv (Conv2D) (None, 28, 28, 512) 66048 ['conv3_block2_2_relu[0][0]']
conv3_block2_3_bn (BatchNormal (None, 28, 28, 512) 2048 ['conv3_block2_3_conv[0][0]']
ization)
conv3_block2_add (Add) (None, 28, 28, 512) 0 ['conv3_block1_out[0][0]',
'conv3_block2_3_bn[0][0]']
conv3_block2_out (Activation) (None, 28, 28, 512) 0 ['conv3_block2_add[0][0]']
conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 65664 ['conv3_block2_out[0][0]']
conv3_block3_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block3_1_conv[0][0]']
ization)
conv3_block3_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block3_1_bn[0][0]']
n)
conv3_block3_2_conv (Conv2D) (None, 28, 28, 128) 147584 ['conv3_block3_1_relu[0][0]']
conv3_block3_2_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block3_2_conv[0][0]']
ization)
conv3_block3_2_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block3_2_bn[0][0]']
n)
conv3_block3_3_conv (Conv2D) (None, 28, 28, 512) 66048 ['conv3_block3_2_relu[0][0]']
conv3_block3_3_bn (BatchNormal (None, 28, 28, 512) 2048 ['conv3_block3_3_conv[0][0]']
ization)
conv3_block3_add (Add) (None, 28, 28, 512) 0 ['conv3_block2_out[0][0]',
'conv3_block3_3_bn[0][0]']
conv3_block3_out (Activation) (None, 28, 28, 512) 0 ['conv3_block3_add[0][0]']
conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 65664 ['conv3_block3_out[0][0]']
conv3_block4_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block4_1_conv[0][0]']
ization)
conv3_block4_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block4_1_bn[0][0]']
n)
conv3_block4_2_conv (Conv2D) (None, 28, 28, 128) 147584 ['conv3_block4_1_relu[0][0]']
conv3_block4_2_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block4_2_conv[0][0]']
ization)
conv3_block4_2_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block4_2_bn[0][0]']
n)
conv3_block4_3_conv (Conv2D) (None, 28, 28, 512) 66048 ['conv3_block4_2_relu[0][0]']
conv3_block4_3_bn (BatchNormal (None, 28, 28, 512) 2048 ['conv3_block4_3_conv[0][0]']
ization)
conv3_block4_add (Add) (None, 28, 28, 512) 0 ['conv3_block3_out[0][0]',
'conv3_block4_3_bn[0][0]']
conv3_block4_out (Activation) (None, 28, 28, 512) 0 ['conv3_block4_add[0][0]']
conv4_block1_1_conv (Conv2D) (None, 14, 14, 256) 131328 ['conv3_block4_out[0][0]']
conv4_block1_1_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block1_1_conv[0][0]']
ization)
conv4_block1_1_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block1_1_bn[0][0]']
n)
conv4_block1_2_conv (Conv2D) (None, 14, 14, 256) 590080 ['conv4_block1_1_relu[0][0]']
conv4_block1_2_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block1_2_conv[0][0]']
ization)
conv4_block1_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block1_2_bn[0][0]']
n)
conv4_block1_0_conv (Conv2D) (None, 14, 14, 1024 525312 ['conv3_block4_out[0][0]']
)
conv4_block1_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block1_2_relu[0][0]']
)
conv4_block1_0_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block1_0_conv[0][0]']
ization) )
conv4_block1_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block1_3_conv[0][0]']
ization) )
conv4_block1_add (Add) (None, 14, 14, 1024 0 ['conv4_block1_0_bn[0][0]',
) 'conv4_block1_3_bn[0][0]']
conv4_block1_out (Activation) (None, 14, 14, 1024 0 ['conv4_block1_add[0][0]']
)
conv4_block2_1_conv (Conv2D) (None, 14, 14, 256) 262400 ['conv4_block1_out[0][0]']
conv4_block2_1_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block2_1_conv[0][0]']
ization)
conv4_block2_1_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block2_1_bn[0][0]']
n)
conv4_block2_2_conv (Conv2D) (None, 14, 14, 256) 590080 ['conv4_block2_1_relu[0][0]']
conv4_block2_2_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block2_2_conv[0][0]']
ization)
conv4_block2_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block2_2_bn[0][0]']
n)
conv4_block2_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block2_2_relu[0][0]']
)
conv4_block2_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block2_3_conv[0][0]']
ization) )
conv4_block2_add (Add) (None, 14, 14, 1024 0 ['conv4_block1_out[0][0]',
) 'conv4_block2_3_bn[0][0]']
conv4_block2_out (Activation) (None, 14, 14, 1024 0 ['conv4_block2_add[0][0]']
)
conv4_block3_1_conv (Conv2D) (None, 14, 14, 256) 262400 ['conv4_block2_out[0][0]']
conv4_block3_1_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block3_1_conv[0][0]']
ization)
conv4_block3_1_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block3_1_bn[0][0]']
n)
conv4_block3_2_conv (Conv2D) (None, 14, 14, 256) 590080 ['conv4_block3_1_relu[0][0]']
conv4_block3_2_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block3_2_conv[0][0]']
ization)
conv4_block3_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block3_2_bn[0][0]']
n)
conv4_block3_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block3_2_relu[0][0]']
)
conv4_block3_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block3_3_conv[0][0]']
ization) )
conv4_block3_add (Add) (None, 14, 14, 1024 0 ['conv4_block2_out[0][0]',
) 'conv4_block3_3_bn[0][0]']
conv4_block3_out (Activation) (None, 14, 14, 1024 0 ['conv4_block3_add[0][0]']
)
conv4_block4_1_conv (Conv2D) (None, 14, 14, 256) 262400 ['conv4_block3_out[0][0]']
conv4_block4_1_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block4_1_conv[0][0]']
ization)
conv4_block4_1_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block4_1_bn[0][0]']
n)
conv4_block4_2_conv (Conv2D) (None, 14, 14, 256) 590080 ['conv4_block4_1_relu[0][0]']
conv4_block4_2_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block4_2_conv[0][0]']
ization)
conv4_block4_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block4_2_bn[0][0]']
n)
conv4_block4_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block4_2_relu[0][0]']
)
conv4_block4_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block4_3_conv[0][0]']
ization) )
conv4_block4_add (Add) (None, 14, 14, 1024 0 ['conv4_block3_out[0][0]',
) 'conv4_block4_3_bn[0][0]']
conv4_block4_out (Activation) (None, 14, 14, 1024 0 ['conv4_block4_add[0][0]']
)
conv4_block5_1_conv (Conv2D) (None, 14, 14, 256) 262400 ['conv4_block4_out[0][0]']
conv4_block5_1_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block5_1_conv[0][0]']
ization)
conv4_block5_1_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block5_1_bn[0][0]']
n)
conv4_block5_2_conv (Conv2D) (None, 14, 14, 256) 590080 ['conv4_block5_1_relu[0][0]']
conv4_block5_2_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block5_2_conv[0][0]']
ization)
conv4_block5_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block5_2_bn[0][0]']
n)
conv4_block5_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block5_2_relu[0][0]']
)
conv4_block5_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block5_3_conv[0][0]']
ization) )
conv4_block5_add (Add) (None, 14, 14, 1024 0 ['conv4_block4_out[0][0]',
) 'conv4_block5_3_bn[0][0]']
conv4_block5_out (Activation) (None, 14, 14, 1024 0 ['conv4_block5_add[0][0]']
)
conv4_block6_1_conv (Conv2D) (None, 14, 14, 256) 262400 ['conv4_block5_out[0][0]']
conv4_block6_1_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block6_1_conv[0][0]']
ization)
conv4_block6_1_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block6_1_bn[0][0]']
n)
conv4_block6_2_conv (Conv2D) (None, 14, 14, 256) 590080 ['conv4_block6_1_relu[0][0]']
conv4_block6_2_bn (BatchNormal (None, 14, 14, 256) 1024 ['conv4_block6_2_conv[0][0]']
ization)
conv4_block6_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block6_2_bn[0][0]']
n)
conv4_block6_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block6_2_relu[0][0]']
)
conv4_block6_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block6_3_conv[0][0]']
ization) )
conv4_block6_add (Add) (None, 14, 14, 1024 0 ['conv4_block5_out[0][0]',
) 'conv4_block6_3_bn[0][0]']
conv4_block6_out (Activation) (None, 14, 14, 1024 0 ['conv4_block6_add[0][0]']
)
conv5_block1_1_conv (Conv2D) (None, 7, 7, 512) 524800 ['conv4_block6_out[0][0]']
conv5_block1_1_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block1_1_conv[0][0]']
ization)
conv5_block1_1_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block1_1_bn[0][0]']
n)
conv5_block1_2_conv (Conv2D) (None, 7, 7, 512) 2359808 ['conv5_block1_1_relu[0][0]']
conv5_block1_2_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block1_2_conv[0][0]']
ization)
conv5_block1_2_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block1_2_bn[0][0]']
n)
conv5_block1_0_conv (Conv2D) (None, 7, 7, 2048) 2099200 ['conv4_block6_out[0][0]']
conv5_block1_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 ['conv5_block1_2_relu[0][0]']
conv5_block1_0_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block1_0_conv[0][0]']
ization)
conv5_block1_3_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block1_3_conv[0][0]']
ization)
conv5_block1_add (Add) (None, 7, 7, 2048) 0 ['conv5_block1_0_bn[0][0]',
'conv5_block1_3_bn[0][0]']
conv5_block1_out (Activation) (None, 7, 7, 2048) 0 ['conv5_block1_add[0][0]']
conv5_block2_1_conv (Conv2D) (None, 7, 7, 512) 1049088 ['conv5_block1_out[0][0]']
conv5_block2_1_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block2_1_conv[0][0]']
ization)
conv5_block2_1_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block2_1_bn[0][0]']
n)
conv5_block2_2_conv (Conv2D) (None, 7, 7, 512) 2359808 ['conv5_block2_1_relu[0][0]']
conv5_block2_2_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block2_2_conv[0][0]']
ization)
conv5_block2_2_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block2_2_bn[0][0]']
n)
conv5_block2_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 ['conv5_block2_2_relu[0][0]']
conv5_block2_3_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block2_3_conv[0][0]']
ization)
conv5_block2_add (Add) (None, 7, 7, 2048) 0 ['conv5_block1_out[0][0]',
'conv5_block2_3_bn[0][0]']
conv5_block2_out (Activation) (None, 7, 7, 2048) 0 ['conv5_block2_add[0][0]']
conv5_block3_1_conv (Conv2D) (None, 7, 7, 512) 1049088 ['conv5_block2_out[0][0]']
conv5_block3_1_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block3_1_conv[0][0]']
ization)
conv5_block3_1_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block3_1_bn[0][0]']
n)
conv5_block3_2_conv (Conv2D) (None, 7, 7, 512) 2359808 ['conv5_block3_1_relu[0][0]']
conv5_block3_2_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block3_2_conv[0][0]']
ization)
conv5_block3_2_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block3_2_bn[0][0]']
n)
conv5_block3_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 ['conv5_block3_2_relu[0][0]']
conv5_block3_3_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block3_3_conv[0][0]']
ization)
conv5_block3_add (Add) (None, 7, 7, 2048) 0 ['conv5_block2_out[0][0]',
'conv5_block3_3_bn[0][0]']
conv5_block3_out (Activation) (None, 7, 7, 2048) 0 ['conv5_block3_add[0][0]']
global_average_pooling2d_1 (Gl (None, 2048) 0 ['conv5_block3_out[0][0]']
obalAveragePooling2D)
dense_2 (Dense) (None, 512) 1049088 ['global_average_pooling2d_1[0][0
]']
dropout_1 (Dropout) (None, 512) 0 ['dense_2[0][0]']
dense_3 (Dense) (None, 196) 100548 ['dropout_1[0][0]']
==================================================================================================
Total params: 24,737,348
Trainable params: 5,615,300
Non-trainable params: 19,122,048
__________________________________________________________________________________________________
# There are 8144 training images and 8041 test images in total
history = ResNet_model.fit_generator(training_set,
steps_per_epoch = int(8144/32),
epochs = 10,
validation_data = test_set,
validation_steps = int(8041/32),
callbacks=[reduce_lr])
C:\Users\adity\AppData\Local\Temp\ipykernel_23008\2675838027.py:2: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators. history = ResNet_model.fit_generator(training_set,
Epoch 1/10 254/254 [==============================] - 155s 584ms/step - loss: 5.3218 - accuracy: 0.0047 - val_loss: 5.2758 - val_accuracy: 0.0054 - lr: 1.0000e-04 Epoch 2/10 254/254 [==============================] - 148s 584ms/step - loss: 5.2743 - accuracy: 0.0073 - val_loss: 5.2601 - val_accuracy: 0.0080 - lr: 1.0000e-04 Epoch 3/10 254/254 [==============================] - 150s 589ms/step - loss: 5.2638 - accuracy: 0.0074 - val_loss: 5.2540 - val_accuracy: 0.0091 - lr: 1.0000e-04 Epoch 4/10 254/254 [==============================] - ETA: 0s - loss: 5.2531 - accuracy: 0.0102 Epoch 4: ReduceLROnPlateau reducing learning rate to 1.9999999494757503e-05. 254/254 [==============================] - 150s 592ms/step - loss: 5.2531 - accuracy: 0.0102 - val_loss: 5.2551 - val_accuracy: 0.0125 - lr: 1.0000e-04 Epoch 5/10 254/254 [==============================] - 150s 589ms/step - loss: 5.2252 - accuracy: 0.0131 - val_loss: 5.2081 - val_accuracy: 0.0193 - lr: 2.0000e-05 Epoch 6/10 254/254 [==============================] - 150s 589ms/step - loss: 5.2121 - accuracy: 0.0122 - val_loss: 5.1959 - val_accuracy: 0.0209 - lr: 2.0000e-05 Epoch 7/10 254/254 [==============================] - 152s 598ms/step - loss: 5.1986 - accuracy: 0.0166 - val_loss: 5.1842 - val_accuracy: 0.0204 - lr: 2.0000e-05 Epoch 8/10 254/254 [==============================] - 149s 588ms/step - loss: 5.1853 - accuracy: 0.0159 - val_loss: 5.1873 - val_accuracy: 0.0192 - lr: 2.0000e-05 Epoch 9/10 254/254 [==============================] - 150s 591ms/step - loss: 5.1756 - accuracy: 0.0152 - val_loss: 5.1716 - val_accuracy: 0.0213 - lr: 2.0000e-05 Epoch 10/10 254/254 [==============================] - 152s 599ms/step - loss: 5.1613 - accuracy: 0.0194 - val_loss: 5.1389 - val_accuracy: 0.0249 - lr: 2.0000e-05
ResNet_model.save('./res_model.h5')
ResNet_model.save_weights('./res_model.h5')
## Accuracy and Loss plots
import matplotlib.pyplot as plt
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(len(accuracy)) # Get number of epochs
plt.plot (epochs, accuracy, label = 'training accuracy')
plt.plot (epochs, val_accuracy, label = 'validation accuracy')
plt.title ('Training and validation accuracy')
plt.legend(loc = 'lower right')
plt.figure()
plt.plot (epochs, loss, label = 'training loss')
plt.plot (epochs, val_loss, label = 'validation loss')
plt.legend(loc = 'upper right')
plt.title ('Training and validation loss')
Text(0.5, 1.0, 'Training and validation loss')
Insights:
1) Accuracy Plot:
- Training Accuracy: The training accuracy steadily increases over epochs, which indicates that the model is learning the features of the training data effectively.
- Validation Accuracy: The validation accuracy also increases initially, indicating that the model generalizes well to unseen data. However, if the validation accuracy starts to stagnate or decrease while the training accuracy continues to increase, it could indicate overfitting.
2) Loss Plot:
- Training Loss: The training loss decreases consistently over epochs, which means that the model is improving in minimizing errors on the training data.
- Validation Loss: The validation loss initially decreases, indicating that the model is generalizing well. However, if the validation loss starts to increase while the training loss continues to decrease, it could indicate overfitting.
3) Overall Insights:
- Generalization: The fact that both training and validation metrics are converging is a good sign, suggesting that the model is likely not overfitting severely.
- Stability: If there is significant fluctuation or irregular patterns in the plots, it might indicate instability in training, potentially due to high learning rates or poor convergence.
- Optimization: If the validation metrics plateau while the training metrics continue to improve, it might indicate that the model has reached its optimal performance on the validation set, and further training might not be beneficial without adjustments to the model or training process.
# Performance of Resnet Model
# Re-initalizing the test data generator with shuffle=False to create the confusion matrix
import numpy as np
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 16,
shuffle=False,
class_mode = 'categorical')
# Predict the whole generator to get predictions
Y_pred = ResNet_model.predict_generator(test_set, int(8041/16+1))
# Find out the predictions classes with maximum probability
y_pred = np.argmax(Y_pred, axis=1)
# Utilities for confusion matrix
from sklearn.metrics import classification_report, confusion_matrix
# Printing the confusion matrix based on the actual data vs predicted data.
print(confusion_matrix(test_set.classes, y_pred))
# Printing the classification report
print(classification_report(test_set.classes, y_pred, target_names=prediction_class))
Found 8041 images belonging to 196 classes.
C:\Users\adity\AppData\Local\Temp\ipykernel_23008\1291638358.py:11: UserWarning: `Model.predict_generator` is deprecated and will be removed in a future version. Please use `Model.predict`, which supports generators. Y_pred = ResNet_model.predict_generator(test_set, int(8041/16+1))
[[23 0 0 ... 0 1 0]
[ 0 0 0 ... 0 0 0]
[ 0 0 0 ... 0 0 0]
...
[ 1 0 0 ... 0 1 1]
[ 1 0 0 ... 0 1 2]
[ 4 0 0 ... 0 0 1]]
precision recall f1-score support
AM General Hummer SUV 2000 0.07 0.52 0.12 44
Acura Integra Type R 2001 0.00 0.00 0.00 44
Acura RL Sedan 2012 0.00 0.00 0.00 32
Acura TL Sedan 2012 0.00 0.00 0.00 43
Acura TL Type-S 2008 0.00 0.00 0.00 42
Acura TSX Sedan 2012 0.00 0.00 0.00 40
Acura ZDX Hatchback 2012 0.00 0.00 0.00 39
Aston Martin V8 Vantage Convertible 2012 0.00 0.00 0.00 45
Aston Martin V8 Vantage Coupe 2012 0.00 0.00 0.00 41
Aston Martin Virage Convertible 2012 0.00 0.00 0.00 33
Aston Martin Virage Coupe 2012 0.00 0.00 0.00 38
Audi 100 Sedan 1994 0.00 0.00 0.00 40
Audi 100 Wagon 1994 0.00 0.00 0.00 42
Audi A5 Coupe 2012 0.00 0.00 0.00 41
Audi R8 Coupe 2012 0.00 0.00 0.00 43
Audi RS 4 Convertible 2008 0.00 0.00 0.00 36
Audi S4 Sedan 2007 0.00 0.00 0.00 45
Audi S4 Sedan 2012 0.00 0.00 0.00 39
Audi S5 Convertible 2012 0.01 0.40 0.03 42
Audi S5 Coupe 2012 0.00 0.00 0.00 42
Audi S6 Sedan 2011 0.04 0.11 0.06 46
Audi TT Hatchback 2011 0.00 0.00 0.00 40
Audi TT RS Coupe 2012 0.00 0.00 0.00 39
Audi TTS Coupe 2012 0.00 0.00 0.00 42
Audi V8 Sedan 1994 0.00 0.00 0.00 43
BMW 1 Series Convertible 2012 0.00 0.00 0.00 35
BMW 1 Series Coupe 2012 0.00 0.00 0.00 41
BMW 3 Series Sedan 2012 0.00 0.00 0.00 42
BMW 3 Series Wagon 2012 0.00 0.00 0.00 41
BMW 6 Series Convertible 2007 0.00 0.00 0.00 44
BMW ActiveHybrid 5 Sedan 2012 0.02 0.03 0.02 34
BMW M3 Coupe 2012 0.01 0.07 0.02 44
BMW M5 Sedan 2010 0.00 0.00 0.00 41
BMW M6 Convertible 2010 0.00 0.00 0.00 41
BMW X3 SUV 2012 0.00 0.00 0.00 38
BMW X5 SUV 2007 0.00 0.00 0.00 41
BMW X6 SUV 2012 0.00 0.00 0.00 42
BMW Z4 Convertible 2012 0.00 0.00 0.00 40
Bentley Arnage Sedan 2009 0.00 0.00 0.00 39
Bentley Continental Flying Spur Sedan 2007 0.00 0.00 0.00 44
Bentley Continental GT Coupe 2007 0.00 0.00 0.00 46
Bentley Continental GT Coupe 2012 0.00 0.00 0.00 34
Bentley Continental Supersports Conv. Convertible 2012 0.00 0.00 0.00 36
Bentley Mulsanne Sedan 2011 0.00 0.00 0.00 35
Bugatti Veyron 16.4 Convertible 2009 0.00 0.00 0.00 32
Bugatti Veyron 16.4 Coupe 2009 0.00 0.00 0.00 43
Buick Enclave SUV 2012 0.00 0.00 0.00 42
Buick Rainier SUV 2007 0.00 0.00 0.00 42
Buick Regal GS 2012 0.00 0.00 0.00 35
Buick Verano Sedan 2012 0.00 0.00 0.00 37
Cadillac CTS-V Sedan 2012 0.00 0.00 0.00 43
Cadillac Escalade EXT Crew Cab 2007 0.00 0.00 0.00 44
Cadillac SRX SUV 2012 0.00 0.00 0.00 41
Chevrolet Avalanche Crew Cab 2012 0.05 0.16 0.08 45
Chevrolet Camaro Convertible 2012 0.06 0.02 0.03 44
Chevrolet Cobalt SS 2010 0.00 0.00 0.00 41
Chevrolet Corvette Convertible 2012 0.00 0.00 0.00 39
Chevrolet Corvette Ron Fellows Edition Z06 2007 0.00 0.00 0.00 37
Chevrolet Corvette ZR1 2012 0.00 0.00 0.00 46
Chevrolet Express Cargo Van 2007 0.00 0.00 0.00 29
Chevrolet Express Van 2007 0.00 0.00 0.00 35
Chevrolet HHR SS 2010 0.00 0.00 0.00 36
Chevrolet Impala Sedan 2007 0.00 0.00 0.00 43
Chevrolet Malibu Hybrid Sedan 2010 0.00 0.00 0.00 38
Chevrolet Malibu Sedan 2007 0.00 0.00 0.00 44
Chevrolet Monte Carlo Coupe 2007 0.00 0.00 0.00 45
Chevrolet Silverado 1500 Classic Extended Cab 2007 0.00 0.00 0.00 42
Chevrolet Silverado 1500 Extended Cab 2012 0.01 0.05 0.02 43
Chevrolet Silverado 1500 Hybrid Crew Cab 2012 0.00 0.00 0.00 40
Chevrolet Silverado 1500 Regular Cab 2012 0.00 0.00 0.00 44
Chevrolet Silverado 2500HD Regular Cab 2012 0.00 0.00 0.00 38
Chevrolet Sonic Sedan 2012 0.00 0.00 0.00 44
Chevrolet Tahoe Hybrid SUV 2012 0.00 0.00 0.00 37
Chevrolet TrailBlazer SS 2009 0.00 0.00 0.00 40
Chevrolet Traverse SUV 2012 0.00 0.00 0.00 44
Chrysler 300 SRT-8 2010 0.02 0.06 0.03 48
Chrysler Aspen SUV 2009 0.00 0.00 0.00 43
Chrysler Crossfire Convertible 2008 0.00 0.00 0.00 43
Chrysler PT Cruiser Convertible 2008 0.00 0.00 0.00 45
Chrysler Sebring Convertible 2010 0.00 0.00 0.00 40
Chrysler Town and Country Minivan 2012 0.00 0.00 0.00 37
Daewoo Nubira Wagon 2002 0.03 0.02 0.03 45
Dodge Caliber Wagon 2007 0.00 0.00 0.00 42
Dodge Caliber Wagon 2012 0.00 0.00 0.00 40
Dodge Caravan Minivan 1997 0.00 0.00 0.00 43
Dodge Challenger SRT8 2011 0.02 0.36 0.04 39
Dodge Charger SRT-8 2009 0.00 0.00 0.00 42
Dodge Charger Sedan 2012 0.00 0.00 0.00 41
Dodge Dakota Club Cab 2007 0.00 0.00 0.00 38
Dodge Dakota Crew Cab 2010 0.00 0.00 0.00 41
Dodge Durango SUV 2007 0.00 0.00 0.00 45
Dodge Durango SUV 2012 0.00 0.00 0.00 43
Dodge Journey SUV 2012 0.06 0.02 0.03 44
Dodge Magnum Wagon 2008 0.00 0.00 0.00 40
Dodge Ram Pickup 3500 Crew Cab 2010 0.00 0.00 0.00 42
Dodge Ram Pickup 3500 Quad Cab 2009 0.00 0.00 0.00 44
Dodge Sprinter Cargo Van 2009 0.00 0.00 0.00 39
Eagle Talon Hatchback 1998 0.07 0.04 0.05 46
FIAT 500 Abarth 2012 0.00 0.00 0.00 27
FIAT 500 Convertible 2012 0.06 0.55 0.11 33
Ferrari 458 Italia Convertible 2012 0.00 0.00 0.00 39
Ferrari 458 Italia Coupe 2012 0.02 0.02 0.02 42
Ferrari California Convertible 2012 0.00 0.00 0.00 39
Ferrari FF Coupe 2012 0.00 0.00 0.00 42
Fisker Karma Sedan 2012 0.00 0.00 0.00 43
Ford E-Series Wagon Van 2012 0.06 0.14 0.08 37
Ford Edge SUV 2012 0.01 0.02 0.01 43
Ford Expedition EL SUV 2009 0.00 0.00 0.00 44
Ford F-150 Regular Cab 2007 0.00 0.00 0.00 45
Ford F-150 Regular Cab 2012 0.02 0.02 0.02 42
Ford F-450 Super Duty Crew Cab 2012 0.00 0.00 0.00 41
Ford Fiesta Sedan 2012 0.00 0.00 0.00 42
Ford Focus Sedan 2007 0.00 0.00 0.00 45
Ford Freestar Minivan 2007 0.00 0.00 0.00 44
Ford GT Coupe 2006 0.00 0.00 0.00 45
Ford Mustang Convertible 2007 0.00 0.00 0.00 44
Ford Ranger SuperCab 2011 0.00 0.00 0.00 42
GMC Acadia SUV 2012 0.01 0.02 0.01 44
GMC Canyon Extended Cab 2012 0.00 0.00 0.00 40
GMC Savana Van 2012 0.05 0.38 0.09 68
GMC Terrain SUV 2012 0.00 0.00 0.00 41
GMC Yukon Hybrid SUV 2012 0.00 0.00 0.00 42
Geo Metro Convertible 1993 0.03 0.48 0.05 44
HUMMER H2 SUT Crew Cab 2009 0.01 0.05 0.02 43
HUMMER H3T Crew Cab 2010 0.00 0.00 0.00 39
Honda Accord Coupe 2012 0.02 0.03 0.02 39
Honda Accord Sedan 2012 0.00 0.00 0.00 38
Honda Odyssey Minivan 2007 0.00 0.00 0.00 41
Honda Odyssey Minivan 2012 0.00 0.00 0.00 42
Hyundai Accent Sedan 2012 0.00 0.00 0.00 24
Hyundai Azera Sedan 2012 0.00 0.00 0.00 42
Hyundai Elantra Sedan 2007 0.00 0.00 0.00 42
Hyundai Elantra Touring Hatchback 2012 0.00 0.00 0.00 42
Hyundai Genesis Sedan 2012 0.00 0.00 0.00 43
Hyundai Santa Fe SUV 2012 0.05 0.02 0.03 42
Hyundai Sonata Hybrid Sedan 2012 0.00 0.00 0.00 33
Hyundai Sonata Sedan 2012 0.00 0.00 0.00 39
Hyundai Tucson SUV 2012 0.00 0.00 0.00 43
Hyundai Veloster Hatchback 2012 0.00 0.00 0.00 41
Hyundai Veracruz SUV 2012 0.00 0.00 0.00 42
Infiniti G Coupe IPL 2012 0.00 0.00 0.00 34
Infiniti QX56 SUV 2011 0.00 0.00 0.00 32
Isuzu Ascender SUV 2008 0.03 0.03 0.03 40
Jaguar XK XKR 2012 0.05 0.04 0.05 46
Jeep Compass SUV 2012 0.00 0.00 0.00 42
Jeep Grand Cherokee SUV 2012 0.25 0.02 0.04 45
Jeep Liberty SUV 2012 0.00 0.00 0.00 44
Jeep Patriot SUV 2012 0.02 0.16 0.03 44
Jeep Wrangler SUV 2012 0.01 0.19 0.03 43
Lamborghini Aventador Coupe 2012 0.00 0.00 0.00 43
Lamborghini Diablo Coupe 2001 0.00 0.00 0.00 44
Lamborghini Gallardo LP 570-4 Superleggera 2012 0.00 0.00 0.00 35
Lamborghini Reventon Coupe 2008 0.00 0.00 0.00 36
Land Rover LR2 SUV 2012 0.01 0.05 0.01 42
Land Rover Range Rover SUV 2012 0.00 0.00 0.00 42
Lincoln Town Car Sedan 2011 0.00 0.00 0.00 39
MINI Cooper Roadster Convertible 2012 0.12 0.19 0.15 36
Maybach Landaulet Convertible 2012 0.00 0.00 0.00 29
Mazda Tribute SUV 2011 0.00 0.00 0.00 36
McLaren MP4-12C Coupe 2012 0.00 0.00 0.00 44
Mercedes-Benz 300-Class Convertible 1993 0.00 0.00 0.00 48
Mercedes-Benz C-Class Sedan 2012 0.00 0.00 0.00 45
Mercedes-Benz E-Class Sedan 2012 0.11 0.05 0.06 43
Mercedes-Benz S-Class Sedan 2012 0.00 0.00 0.00 44
Mercedes-Benz SL-Class Coupe 2009 0.00 0.00 0.00 36
Mercedes-Benz Sprinter Van 2012 0.00 0.00 0.00 41
Mitsubishi Lancer Sedan 2012 0.00 0.00 0.00 47
Nissan 240SX Coupe 1998 0.00 0.00 0.00 46
Nissan Juke Hatchback 2012 0.00 0.00 0.00 44
Nissan Leaf Hatchback 2012 0.00 0.00 0.00 42
Nissan NV Passenger Van 2012 0.00 0.00 0.00 38
Plymouth Neon Coupe 1999 0.02 0.05 0.03 44
Porsche Panamera Sedan 2012 0.00 0.00 0.00 43
Ram C-V Cargo Van Minivan 2012 0.02 0.15 0.03 41
Rolls-Royce Ghost Sedan 2012 0.00 0.00 0.00 38
Rolls-Royce Phantom Drophead Coupe Convertible 2012 0.00 0.00 0.00 30
Rolls-Royce Phantom Sedan 2012 0.00 0.00 0.00 44
Scion xD Hatchback 2012 0.00 0.00 0.00 41
Spyker C8 Convertible 2009 0.14 0.02 0.04 45
Spyker C8 Coupe 2009 0.00 0.00 0.00 42
Suzuki Aerio Sedan 2007 0.00 0.00 0.00 38
Suzuki Kizashi Sedan 2012 0.00 0.00 0.00 46
Suzuki SX4 Hatchback 2012 0.00 0.00 0.00 42
Suzuki SX4 Sedan 2012 0.00 0.00 0.00 40
Tesla Model S Sedan 2012 0.02 0.03 0.02 38
Toyota 4Runner SUV 2012 0.00 0.00 0.00 40
Toyota Camry Sedan 2012 0.00 0.00 0.00 43
Toyota Corolla Sedan 2012 0.00 0.00 0.00 43
Toyota Sequoia SUV 2012 0.00 0.00 0.00 38
Volkswagen Beetle Hatchback 2012 0.00 0.00 0.00 42
Volkswagen Golf Hatchback 1991 0.00 0.00 0.00 46
Volkswagen Golf Hatchback 2012 0.00 0.00 0.00 43
Volvo 240 Sedan 1993 0.00 0.00 0.00 45
Volvo C30 Hatchback 2012 0.00 0.00 0.00 41
Volvo XC90 SUV 2007 0.02 0.02 0.02 43
smart fortwo Convertible 2012 0.04 0.03 0.03 40
accuracy 0.02 8041
macro avg 0.01 0.02 0.01 8041
weighted avg 0.01 0.02 0.01 8041
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\sklearn\metrics\_classification.py:1497: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
_warn_prf(average, modifier, f"{metric.capitalize()} is", len(result))
Mobile Net¶
# Model 7 - MobileNet
from tensorflow.keras.applications.mobilenet import MobileNet
base_model = MobileNet(input_shape=(224, 224, 3), include_top=False, weights='imagenet')
#Freeze all the layers
for layer in base_model.layers:
layer.trainable = True
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1024, activation='relu')(x)
x = Dropout(0.3)(x)
x = Dense(512, activation='relu')(x)
x = Dropout(0.2)(x)
x = Dense(196, activation='softmax')(x)
MobileNet_model = Model(inputs=base_model.input, outputs=x)
reduce_lr = ReduceLROnPlateau(monitor='val_accuracy', factor=0.2,
patience=2, min_lr=0.000001, min_delta=0.01,
verbose=2, cooldown=1)
opt = Adam(lr=0.0001)
MobileNet_model.compile(optimizer = opt, loss = 'categorical_crossentropy', metrics = ['accuracy'])
MobileNet_model.summary()
Model: "model_4"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_3 (InputLayer) [(None, 224, 224, 3)] 0
conv1 (Conv2D) (None, 112, 112, 32) 864
conv1_bn (BatchNormalizatio (None, 112, 112, 32) 128
n)
conv1_relu (ReLU) (None, 112, 112, 32) 0
conv_dw_1 (DepthwiseConv2D) (None, 112, 112, 32) 288
conv_dw_1_bn (BatchNormaliz (None, 112, 112, 32) 128
ation)
conv_dw_1_relu (ReLU) (None, 112, 112, 32) 0
conv_pw_1 (Conv2D) (None, 112, 112, 64) 2048
conv_pw_1_bn (BatchNormaliz (None, 112, 112, 64) 256
ation)
conv_pw_1_relu (ReLU) (None, 112, 112, 64) 0
conv_pad_2 (ZeroPadding2D) (None, 113, 113, 64) 0
conv_dw_2 (DepthwiseConv2D) (None, 56, 56, 64) 576
conv_dw_2_bn (BatchNormaliz (None, 56, 56, 64) 256
ation)
conv_dw_2_relu (ReLU) (None, 56, 56, 64) 0
conv_pw_2 (Conv2D) (None, 56, 56, 128) 8192
conv_pw_2_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_2_relu (ReLU) (None, 56, 56, 128) 0
conv_dw_3 (DepthwiseConv2D) (None, 56, 56, 128) 1152
conv_dw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_dw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pw_3 (Conv2D) (None, 56, 56, 128) 16384
conv_pw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pad_4 (ZeroPadding2D) (None, 57, 57, 128) 0
conv_dw_4 (DepthwiseConv2D) (None, 28, 28, 128) 1152
conv_dw_4_bn (BatchNormaliz (None, 28, 28, 128) 512
ation)
conv_dw_4_relu (ReLU) (None, 28, 28, 128) 0
conv_pw_4 (Conv2D) (None, 28, 28, 256) 32768
conv_pw_4_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_4_relu (ReLU) (None, 28, 28, 256) 0
conv_dw_5 (DepthwiseConv2D) (None, 28, 28, 256) 2304
conv_dw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_dw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pw_5 (Conv2D) (None, 28, 28, 256) 65536
conv_pw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pad_6 (ZeroPadding2D) (None, 29, 29, 256) 0
conv_dw_6 (DepthwiseConv2D) (None, 14, 14, 256) 2304
conv_dw_6_bn (BatchNormaliz (None, 14, 14, 256) 1024
ation)
conv_dw_6_relu (ReLU) (None, 14, 14, 256) 0
conv_pw_6 (Conv2D) (None, 14, 14, 512) 131072
conv_pw_6_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_6_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_7 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_7 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_8 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_8 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_9 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_9 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_10 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_10 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_11 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_11 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pad_12 (ZeroPadding2D) (None, 15, 15, 512) 0
conv_dw_12 (DepthwiseConv2D (None, 7, 7, 512) 4608
)
conv_dw_12_bn (BatchNormali (None, 7, 7, 512) 2048
zation)
conv_dw_12_relu (ReLU) (None, 7, 7, 512) 0
conv_pw_12 (Conv2D) (None, 7, 7, 1024) 524288
conv_pw_12_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_12_relu (ReLU) (None, 7, 7, 1024) 0
conv_dw_13 (DepthwiseConv2D (None, 7, 7, 1024) 9216
)
conv_dw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_dw_13_relu (ReLU) (None, 7, 7, 1024) 0
conv_pw_13 (Conv2D) (None, 7, 7, 1024) 1048576
conv_pw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_13_relu (ReLU) (None, 7, 7, 1024) 0
global_average_pooling2d_2 (None, 1024) 0
(GlobalAveragePooling2D)
dense_6 (Dense) (None, 1024) 1049600
dropout_6 (Dropout) (None, 1024) 0
dense_7 (Dense) (None, 512) 524800
dropout_7 (Dropout) (None, 512) 0
dense_8 (Dense) (None, 196) 100548
=================================================================
Total params: 4,903,812
Trainable params: 4,881,924
Non-trainable params: 21,888
_________________________________________________________________
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead. super().__init__(name, **kwargs)
# There are 8144 training images and 8041 test images in total
history = MobileNet_model.fit_generator(training_set,
steps_per_epoch = int(8144/16),
epochs = 20,
validation_data = test_set,
validation_steps = int(8041/16),
callbacks=[reduce_lr])
C:\Users\adity\AppData\Local\Temp\ipykernel_26256\1776356803.py:2: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators. history = MobileNet_model.fit_generator(training_set,
Epoch 1/20 509/509 [==============================] - 145s 275ms/step - loss: 5.1656 - accuracy: 0.0204 - val_loss: 4.6428 - val_accuracy: 0.0687 - lr: 1.0000e-04 Epoch 2/20 509/509 [==============================] - 131s 257ms/step - loss: 4.1861 - accuracy: 0.1057 - val_loss: 3.1487 - val_accuracy: 0.2776 - lr: 1.0000e-04 Epoch 3/20 509/509 [==============================] - 131s 256ms/step - loss: 3.0088 - accuracy: 0.2735 - val_loss: 2.0941 - val_accuracy: 0.4838 - lr: 1.0000e-04 Epoch 4/20 509/509 [==============================] - 130s 255ms/step - loss: 2.1915 - accuracy: 0.4213 - val_loss: 1.5932 - val_accuracy: 0.5880 - lr: 1.0000e-04 Epoch 5/20 509/509 [==============================] - 130s 256ms/step - loss: 1.6611 - accuracy: 0.5474 - val_loss: 1.2649 - val_accuracy: 0.6558 - lr: 1.0000e-04 Epoch 6/20 509/509 [==============================] - 130s 256ms/step - loss: 1.2821 - accuracy: 0.6373 - val_loss: 1.0916 - val_accuracy: 0.7018 - lr: 1.0000e-04 Epoch 7/20 509/509 [==============================] - 131s 257ms/step - loss: 1.0549 - accuracy: 0.6978 - val_loss: 0.9602 - val_accuracy: 0.7296 - lr: 1.0000e-04 Epoch 8/20 509/509 [==============================] - 130s 255ms/step - loss: 0.8889 - accuracy: 0.7387 - val_loss: 0.8599 - val_accuracy: 0.7534 - lr: 1.0000e-04 Epoch 9/20 509/509 [==============================] - 129s 254ms/step - loss: 0.7256 - accuracy: 0.7812 - val_loss: 0.8326 - val_accuracy: 0.7636 - lr: 1.0000e-04 Epoch 10/20 509/509 [==============================] - 130s 256ms/step - loss: 0.6439 - accuracy: 0.8067 - val_loss: 0.7697 - val_accuracy: 0.7834 - lr: 1.0000e-04 Epoch 11/20 509/509 [==============================] - 130s 255ms/step - loss: 0.5273 - accuracy: 0.8418 - val_loss: 0.7902 - val_accuracy: 0.7785 - lr: 1.0000e-04 Epoch 12/20 509/509 [==============================] - ETA: 0s - loss: 0.4871 - accuracy: 0.8563 Epoch 12: ReduceLROnPlateau reducing learning rate to 1.9999999494757503e-05. 509/509 [==============================] - 131s 257ms/step - loss: 0.4871 - accuracy: 0.8563 - val_loss: 0.7982 - val_accuracy: 0.7779 - lr: 1.0000e-04 Epoch 13/20 509/509 [==============================] - 131s 258ms/step - loss: 0.3259 - accuracy: 0.9014 - val_loss: 0.6251 - val_accuracy: 0.8247 - lr: 2.0000e-05 Epoch 14/20 509/509 [==============================] - 130s 255ms/step - loss: 0.2826 - accuracy: 0.9149 - val_loss: 0.6049 - val_accuracy: 0.8306 - lr: 2.0000e-05 Epoch 15/20 509/509 [==============================] - ETA: 0s - loss: 0.2530 - accuracy: 0.9231 Epoch 15: ReduceLROnPlateau reducing learning rate to 3.999999898951501e-06. 509/509 [==============================] - 131s 258ms/step - loss: 0.2530 - accuracy: 0.9231 - val_loss: 0.6057 - val_accuracy: 0.8323 - lr: 2.0000e-05 Epoch 16/20 509/509 [==============================] - 131s 258ms/step - loss: 0.2274 - accuracy: 0.9331 - val_loss: 0.5969 - val_accuracy: 0.8337 - lr: 4.0000e-06 Epoch 17/20 509/509 [==============================] - 131s 256ms/step - loss: 0.2209 - accuracy: 0.9365 - val_loss: 0.5886 - val_accuracy: 0.8383 - lr: 4.0000e-06 Epoch 18/20 509/509 [==============================] - 130s 256ms/step - loss: 0.2058 - accuracy: 0.9386 - val_loss: 0.5900 - val_accuracy: 0.8378 - lr: 4.0000e-06 Epoch 19/20 509/509 [==============================] - ETA: 0s - loss: 0.2087 - accuracy: 0.9369 Epoch 19: ReduceLROnPlateau reducing learning rate to 1e-06. 509/509 [==============================] - 131s 258ms/step - loss: 0.2087 - accuracy: 0.9369 - val_loss: 0.5900 - val_accuracy: 0.8368 - lr: 4.0000e-06 Epoch 20/20 509/509 [==============================] - 130s 255ms/step - loss: 0.2082 - accuracy: 0.9398 - val_loss: 0.5873 - val_accuracy: 0.8385 - lr: 1.0000e-06
MobileNet_model.save('./mob_model.h5')
MobileNet_model.save_weights('./mob_model.h5')
## Accuracy and Loss plots
import matplotlib.pyplot as plt
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(len(accuracy)) # Get number of epochs
plt.plot (epochs, accuracy, label = 'training accuracy')
plt.plot (epochs, val_accuracy, label = 'validation accuracy')
plt.title ('Training and validation accuracy')
plt.legend(loc = 'lower right')
plt.figure()
plt.plot (epochs, loss, label = 'training loss')
plt.plot (epochs, val_loss, label = 'validation loss')
plt.legend(loc = 'upper right')
plt.title ('Training and validation loss')
Text(0.5, 1.0, 'Training and validation loss')
Insights
1) Accuracy Plot:
- Training Accuracy: The training accuracy increases steadily over epochs, indicating that the model is learning from the training data.
- Validation Accuracy: Initially, the validation accuracy increases, which suggests that the model is generalizing well to unseen data. However, it seems to fluctuate later on, which could indicate some instability or overfitting.
2) Loss Plot:
- Training Loss: The training loss decreases consistently over epochs, indicating that the model is improving its predictive performance on the training data.
- Validation Loss: The validation loss decreases initially, which is a good sign of generalization. However, similar to validation accuracy, it shows fluctuations later on, possibly indicating overfitting or instability.
3) Overall Insights:
- Overfitting: The fluctuation in validation accuracy and loss after some epochs suggests that the model might be overfitting to the training data. Overfitting occurs when the model learns to memorize the training data instead of generalizing patterns.
- Learning Rate Adjustment: The reduction in learning rate using the ReduceLROnPlateau callback might help stabilize the training process, but the fluctuations suggest that the learning rate adjustment might need further tuning.
- Model Complexity: The model architecture appears to be quite complex with multiple dense layers. It's possible that the model is too complex for the given data, leading to overfitting. Simplifying the model architecture or applying stronger regularization techniques could be beneficial.
# Performance of Mobile Net Model
# Re-initalizing the test data generator with shuffle=False to create the confusion matrix
import numpy as np
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 16,
shuffle=False,
class_mode = 'categorical')
# Predict the whole generator to get predictions
Y_pred = MobileNet_model.predict_generator(test_set, int(8041/16+1))
# Find out the predictions classes with maximum probability
y_pred = np.argmax(Y_pred, axis=1)
# Utilities for confusion matrix
from sklearn.metrics import classification_report, confusion_matrix
# Printing the confusion matrix based on the actual data vs predicted data.
print(confusion_matrix(test_set.classes, y_pred))
# Printing the classification report
print(classification_report(test_set.classes, y_pred, target_names=prediction_class))
Found 8041 images belonging to 196 classes.
C:\Users\adity\AppData\Local\Temp\ipykernel_26256\1308849154.py:11: UserWarning: `Model.predict_generator` is deprecated and will be removed in a future version. Please use `Model.predict`, which supports generators. Y_pred = MobileNet_model.predict_generator(test_set, int(8041/16+1))
[[42 0 0 ... 0 0 0]
[ 0 39 0 ... 0 0 0]
[ 0 0 26 ... 0 0 0]
...
[ 0 0 0 ... 36 0 0]
[ 0 0 0 ... 0 37 0]
[ 0 0 0 ... 1 0 37]]
precision recall f1-score support
AM General Hummer SUV 2000 0.91 0.95 0.93 44
Acura Integra Type R 2001 0.81 0.89 0.85 44
Acura RL Sedan 2012 0.70 0.81 0.75 32
Acura TL Sedan 2012 0.74 0.93 0.82 43
Acura TL Type-S 2008 1.00 0.90 0.95 42
Acura TSX Sedan 2012 0.94 0.72 0.82 40
Acura ZDX Hatchback 2012 0.78 0.82 0.80 39
Aston Martin V8 Vantage Convertible 2012 0.69 0.64 0.67 45
Aston Martin V8 Vantage Coupe 2012 0.70 0.68 0.69 41
Aston Martin Virage Convertible 2012 0.83 0.61 0.70 33
Aston Martin Virage Coupe 2012 0.71 0.84 0.77 38
Audi 100 Sedan 1994 0.63 0.68 0.65 40
Audi 100 Wagon 1994 0.69 0.43 0.53 42
Audi A5 Coupe 2012 0.57 0.85 0.69 41
Audi R8 Coupe 2012 0.78 0.84 0.81 43
Audi RS 4 Convertible 2008 0.84 0.75 0.79 36
Audi S4 Sedan 2007 0.79 0.82 0.80 45
Audi S4 Sedan 2012 0.59 0.51 0.55 39
Audi S5 Convertible 2012 0.83 0.69 0.75 42
Audi S5 Coupe 2012 0.66 0.45 0.54 42
Audi S6 Sedan 2011 0.78 0.83 0.80 46
Audi TT Hatchback 2011 0.49 0.70 0.58 40
Audi TT RS Coupe 2012 0.71 0.64 0.68 39
Audi TTS Coupe 2012 0.62 0.48 0.54 42
Audi V8 Sedan 1994 0.58 0.67 0.62 43
BMW 1 Series Convertible 2012 0.87 0.97 0.92 35
BMW 1 Series Coupe 2012 0.97 0.95 0.96 41
BMW 3 Series Sedan 2012 0.77 0.81 0.79 42
BMW 3 Series Wagon 2012 0.74 0.83 0.78 41
BMW 6 Series Convertible 2007 0.91 0.73 0.81 44
BMW ActiveHybrid 5 Sedan 2012 0.86 0.94 0.90 34
BMW M3 Coupe 2012 0.89 0.89 0.89 44
BMW M5 Sedan 2010 0.82 0.80 0.81 41
BMW M6 Convertible 2010 0.67 0.80 0.73 41
BMW X3 SUV 2012 0.95 0.97 0.96 38
BMW X5 SUV 2007 0.86 0.93 0.89 41
BMW X6 SUV 2012 1.00 0.86 0.92 42
BMW Z4 Convertible 2012 1.00 0.80 0.89 40
Bentley Arnage Sedan 2009 0.80 1.00 0.89 39
Bentley Continental Flying Spur Sedan 2007 0.87 0.89 0.88 44
Bentley Continental GT Coupe 2007 0.69 0.74 0.72 46
Bentley Continental GT Coupe 2012 0.76 0.76 0.76 34
Bentley Continental Supersports Conv. Convertible 2012 0.90 0.75 0.82 36
Bentley Mulsanne Sedan 2011 0.86 0.91 0.89 35
Bugatti Veyron 16.4 Convertible 2009 0.70 0.59 0.64 32
Bugatti Veyron 16.4 Coupe 2009 0.69 0.81 0.74 43
Buick Enclave SUV 2012 0.90 0.90 0.90 42
Buick Rainier SUV 2007 0.84 0.86 0.85 42
Buick Regal GS 2012 0.94 0.89 0.91 35
Buick Verano Sedan 2012 0.81 0.92 0.86 37
Cadillac CTS-V Sedan 2012 0.89 0.98 0.93 43
Cadillac Escalade EXT Crew Cab 2007 0.98 0.95 0.97 44
Cadillac SRX SUV 2012 1.00 0.93 0.96 41
Chevrolet Avalanche Crew Cab 2012 0.83 0.78 0.80 45
Chevrolet Camaro Convertible 2012 0.84 0.82 0.83 44
Chevrolet Cobalt SS 2010 0.92 0.80 0.86 41
Chevrolet Corvette Convertible 2012 0.73 0.82 0.77 39
Chevrolet Corvette Ron Fellows Edition Z06 2007 0.84 0.86 0.85 37
Chevrolet Corvette ZR1 2012 0.81 0.74 0.77 46
Chevrolet Express Cargo Van 2007 0.45 0.52 0.48 29
Chevrolet Express Van 2007 0.46 0.34 0.39 35
Chevrolet HHR SS 2010 0.97 0.97 0.97 36
Chevrolet Impala Sedan 2007 0.86 0.84 0.85 43
Chevrolet Malibu Hybrid Sedan 2010 0.89 0.89 0.89 38
Chevrolet Malibu Sedan 2007 0.80 0.80 0.80 44
Chevrolet Monte Carlo Coupe 2007 0.81 0.78 0.80 45
Chevrolet Silverado 1500 Classic Extended Cab 2007 0.88 0.83 0.85 42
Chevrolet Silverado 1500 Extended Cab 2012 0.69 0.67 0.68 43
Chevrolet Silverado 1500 Hybrid Crew Cab 2012 0.71 0.60 0.65 40
Chevrolet Silverado 1500 Regular Cab 2012 0.60 0.77 0.67 44
Chevrolet Silverado 2500HD Regular Cab 2012 0.62 0.66 0.64 38
Chevrolet Sonic Sedan 2012 0.85 0.80 0.82 44
Chevrolet Tahoe Hybrid SUV 2012 0.79 0.81 0.80 37
Chevrolet TrailBlazer SS 2009 0.95 0.97 0.96 40
Chevrolet Traverse SUV 2012 0.93 0.89 0.91 44
Chrysler 300 SRT-8 2010 0.77 0.75 0.76 48
Chrysler Aspen SUV 2009 0.97 0.91 0.94 43
Chrysler Crossfire Convertible 2008 0.91 0.91 0.91 43
Chrysler PT Cruiser Convertible 2008 0.98 1.00 0.99 45
Chrysler Sebring Convertible 2010 0.91 0.78 0.84 40
Chrysler Town and Country Minivan 2012 0.89 0.89 0.89 37
Daewoo Nubira Wagon 2002 0.95 0.89 0.92 45
Dodge Caliber Wagon 2007 0.80 0.88 0.84 42
Dodge Caliber Wagon 2012 0.77 0.75 0.76 40
Dodge Caravan Minivan 1997 1.00 0.95 0.98 43
Dodge Challenger SRT8 2011 0.87 1.00 0.93 39
Dodge Charger SRT-8 2009 0.80 0.88 0.84 42
Dodge Charger Sedan 2012 0.84 0.76 0.79 41
Dodge Dakota Club Cab 2007 0.76 0.89 0.82 38
Dodge Dakota Crew Cab 2010 1.00 0.83 0.91 41
Dodge Durango SUV 2007 0.93 0.93 0.93 45
Dodge Durango SUV 2012 0.87 0.95 0.91 43
Dodge Journey SUV 2012 0.97 0.89 0.93 44
Dodge Magnum Wagon 2008 0.80 0.82 0.81 40
Dodge Ram Pickup 3500 Crew Cab 2010 0.90 0.86 0.88 42
Dodge Ram Pickup 3500 Quad Cab 2009 0.87 0.77 0.82 44
Dodge Sprinter Cargo Van 2009 0.79 0.59 0.68 39
Eagle Talon Hatchback 1998 0.93 0.83 0.87 46
FIAT 500 Abarth 2012 0.90 1.00 0.95 27
FIAT 500 Convertible 2012 1.00 0.94 0.97 33
Ferrari 458 Italia Convertible 2012 0.61 0.72 0.66 39
Ferrari 458 Italia Coupe 2012 0.79 0.55 0.65 42
Ferrari California Convertible 2012 0.83 0.90 0.86 39
Ferrari FF Coupe 2012 0.92 0.79 0.85 42
Fisker Karma Sedan 2012 0.93 0.95 0.94 43
Ford E-Series Wagon Van 2012 0.86 0.97 0.91 37
Ford Edge SUV 2012 0.89 0.98 0.93 43
Ford Expedition EL SUV 2009 0.87 0.93 0.90 44
Ford F-150 Regular Cab 2007 0.89 0.91 0.90 45
Ford F-150 Regular Cab 2012 1.00 0.90 0.95 42
Ford F-450 Super Duty Crew Cab 2012 0.98 0.98 0.98 41
Ford Fiesta Sedan 2012 0.85 0.81 0.83 42
Ford Focus Sedan 2007 0.81 0.87 0.84 45
Ford Freestar Minivan 2007 0.93 0.93 0.93 44
Ford GT Coupe 2006 0.84 0.91 0.87 45
Ford Mustang Convertible 2007 0.85 0.80 0.82 44
Ford Ranger SuperCab 2011 0.87 0.93 0.90 42
GMC Acadia SUV 2012 0.91 0.95 0.93 44
GMC Canyon Extended Cab 2012 0.80 0.90 0.85 40
GMC Savana Van 2012 0.74 0.82 0.78 68
GMC Terrain SUV 2012 0.95 0.93 0.94 41
GMC Yukon Hybrid SUV 2012 0.91 0.74 0.82 42
Geo Metro Convertible 1993 0.91 0.89 0.90 44
HUMMER H2 SUT Crew Cab 2009 0.92 0.77 0.84 43
HUMMER H3T Crew Cab 2010 0.85 0.87 0.86 39
Honda Accord Coupe 2012 0.94 0.85 0.89 39
Honda Accord Sedan 2012 0.78 0.84 0.81 38
Honda Odyssey Minivan 2007 0.84 0.90 0.87 41
Honda Odyssey Minivan 2012 0.85 0.95 0.90 42
Hyundai Accent Sedan 2012 0.69 0.75 0.72 24
Hyundai Azera Sedan 2012 0.82 0.76 0.79 42
Hyundai Elantra Sedan 2007 0.87 0.81 0.84 42
Hyundai Elantra Touring Hatchback 2012 0.82 0.95 0.88 42
Hyundai Genesis Sedan 2012 0.72 0.91 0.80 43
Hyundai Santa Fe SUV 2012 0.95 0.88 0.91 42
Hyundai Sonata Hybrid Sedan 2012 0.89 0.94 0.91 33
Hyundai Sonata Sedan 2012 0.91 0.79 0.85 39
Hyundai Tucson SUV 2012 0.93 0.93 0.93 43
Hyundai Veloster Hatchback 2012 0.95 0.95 0.95 41
Hyundai Veracruz SUV 2012 0.75 0.64 0.69 42
Infiniti G Coupe IPL 2012 0.93 0.82 0.88 34
Infiniti QX56 SUV 2011 0.97 0.94 0.95 32
Isuzu Ascender SUV 2008 0.95 0.88 0.91 40
Jaguar XK XKR 2012 0.77 0.78 0.77 46
Jeep Compass SUV 2012 0.80 0.88 0.84 42
Jeep Grand Cherokee SUV 2012 0.89 0.87 0.88 45
Jeep Liberty SUV 2012 0.91 0.91 0.91 44
Jeep Patriot SUV 2012 0.97 0.84 0.90 44
Jeep Wrangler SUV 2012 0.98 0.98 0.98 43
Lamborghini Aventador Coupe 2012 0.88 0.65 0.75 43
Lamborghini Diablo Coupe 2001 0.82 0.91 0.86 44
Lamborghini Gallardo LP 570-4 Superleggera 2012 0.89 0.89 0.89 35
Lamborghini Reventon Coupe 2008 0.71 0.94 0.81 36
Land Rover LR2 SUV 2012 0.95 0.83 0.89 42
Land Rover Range Rover SUV 2012 0.93 0.90 0.92 42
Lincoln Town Car Sedan 2011 0.95 0.95 0.95 39
MINI Cooper Roadster Convertible 2012 0.92 0.97 0.95 36
Maybach Landaulet Convertible 2012 0.93 0.90 0.91 29
Mazda Tribute SUV 2011 0.95 0.97 0.96 36
McLaren MP4-12C Coupe 2012 0.80 0.93 0.86 44
Mercedes-Benz 300-Class Convertible 1993 0.86 0.90 0.88 48
Mercedes-Benz C-Class Sedan 2012 0.91 0.87 0.89 45
Mercedes-Benz E-Class Sedan 2012 0.77 0.84 0.80 43
Mercedes-Benz S-Class Sedan 2012 0.83 0.86 0.84 44
Mercedes-Benz SL-Class Coupe 2009 0.88 0.78 0.82 36
Mercedes-Benz Sprinter Van 2012 0.70 0.95 0.80 41
Mitsubishi Lancer Sedan 2012 0.82 0.85 0.83 47
Nissan 240SX Coupe 1998 0.90 0.96 0.93 46
Nissan Juke Hatchback 2012 0.91 0.89 0.90 44
Nissan Leaf Hatchback 2012 1.00 0.93 0.96 42
Nissan NV Passenger Van 2012 0.94 0.87 0.90 38
Plymouth Neon Coupe 1999 0.93 0.91 0.92 44
Porsche Panamera Sedan 2012 0.75 0.88 0.81 43
Ram C-V Cargo Van Minivan 2012 0.94 0.80 0.87 41
Rolls-Royce Ghost Sedan 2012 0.71 0.92 0.80 38
Rolls-Royce Phantom Drophead Coupe Convertible 2012 0.80 0.80 0.80 30
Rolls-Royce Phantom Sedan 2012 0.88 0.66 0.75 44
Scion xD Hatchback 2012 0.88 0.90 0.89 41
Spyker C8 Convertible 2009 0.86 0.82 0.84 45
Spyker C8 Coupe 2009 0.77 0.64 0.70 42
Suzuki Aerio Sedan 2007 0.65 0.74 0.69 38
Suzuki Kizashi Sedan 2012 0.66 0.76 0.71 46
Suzuki SX4 Hatchback 2012 0.87 0.93 0.90 42
Suzuki SX4 Sedan 2012 0.89 0.62 0.74 40
Tesla Model S Sedan 2012 0.95 0.95 0.95 38
Toyota 4Runner SUV 2012 0.93 0.95 0.94 40
Toyota Camry Sedan 2012 0.85 0.95 0.90 43
Toyota Corolla Sedan 2012 0.89 0.77 0.82 43
Toyota Sequoia SUV 2012 0.94 0.89 0.92 38
Volkswagen Beetle Hatchback 2012 0.93 1.00 0.97 42
Volkswagen Golf Hatchback 1991 0.87 0.98 0.92 46
Volkswagen Golf Hatchback 2012 0.97 0.70 0.81 43
Volvo 240 Sedan 1993 0.85 0.98 0.91 45
Volvo C30 Hatchback 2012 0.95 0.88 0.91 41
Volvo XC90 SUV 2007 0.90 0.86 0.88 43
smart fortwo Convertible 2012 1.00 0.93 0.96 40
accuracy 0.84 8041
macro avg 0.84 0.84 0.84 8041
weighted avg 0.84 0.84 0.84 8041
EfficientNet¶
# Model 8 -EfficientNet
from tensorflow.keras.applications.efficientnet import EfficientNetB0
base_model = EfficientNetB0(input_shape=(224, 224, 3), weights='imagenet', include_top=False)
#Freeze all the layers|
for layer in base_model.layers:
layer.trainable = True
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1024, activation='relu')(x)
x = Dropout(0.3)(x)
x = Dense(512, activation='relu')(x)
x = Dropout(0.2)(x)
x = Dense(196, activation='softmax')(x)
EffNet_model = Model(inputs=base_model.input, outputs=x)
reduce_lr = ReduceLROnPlateau(monitor='val_accuracy', factor=0.2,
patience=2, min_lr=0.00000001, min_delta=0.01,
verbose=2, cooldown=1)
opt = Adam(lr=0.0001)
EffNet_model.compile(optimizer = opt, loss = 'categorical_crossentropy', metrics = ['accuracy'])
EffNet_model.summary()
Model: "model_5"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_4 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling_2 (Rescaling) (None, 224, 224, 3) 0 ['input_4[0][0]']
normalization_1 (Normalization (None, 224, 224, 3) 7 ['rescaling_2[0][0]']
)
rescaling_3 (Rescaling) (None, 224, 224, 3) 0 ['normalization_1[0][0]']
stem_conv_pad (ZeroPadding2D) (None, 225, 225, 3) 0 ['rescaling_3[0][0]']
stem_conv (Conv2D) (None, 112, 112, 32 864 ['stem_conv_pad[0][0]']
)
stem_bn (BatchNormalization) (None, 112, 112, 32 128 ['stem_conv[0][0]']
)
stem_activation (Activation) (None, 112, 112, 32 0 ['stem_bn[0][0]']
)
block1a_dwconv (DepthwiseConv2 (None, 112, 112, 32 288 ['stem_activation[0][0]']
D) )
block1a_bn (BatchNormalization (None, 112, 112, 32 128 ['block1a_dwconv[0][0]']
) )
block1a_activation (Activation (None, 112, 112, 32 0 ['block1a_bn[0][0]']
) )
block1a_se_squeeze (GlobalAver (None, 32) 0 ['block1a_activation[0][0]']
agePooling2D)
block1a_se_reshape (Reshape) (None, 1, 1, 32) 0 ['block1a_se_squeeze[0][0]']
block1a_se_reduce (Conv2D) (None, 1, 1, 8) 264 ['block1a_se_reshape[0][0]']
block1a_se_expand (Conv2D) (None, 1, 1, 32) 288 ['block1a_se_reduce[0][0]']
block1a_se_excite (Multiply) (None, 112, 112, 32 0 ['block1a_activation[0][0]',
) 'block1a_se_expand[0][0]']
block1a_project_conv (Conv2D) (None, 112, 112, 16 512 ['block1a_se_excite[0][0]']
)
block1a_project_bn (BatchNorma (None, 112, 112, 16 64 ['block1a_project_conv[0][0]']
lization) )
block2a_expand_conv (Conv2D) (None, 112, 112, 96 1536 ['block1a_project_bn[0][0]']
)
block2a_expand_bn (BatchNormal (None, 112, 112, 96 384 ['block2a_expand_conv[0][0]']
ization) )
block2a_expand_activation (Act (None, 112, 112, 96 0 ['block2a_expand_bn[0][0]']
ivation) )
block2a_dwconv_pad (ZeroPaddin (None, 113, 113, 96 0 ['block2a_expand_activation[0][0]
g2D) ) ']
block2a_dwconv (DepthwiseConv2 (None, 56, 56, 96) 864 ['block2a_dwconv_pad[0][0]']
D)
block2a_bn (BatchNormalization (None, 56, 56, 96) 384 ['block2a_dwconv[0][0]']
)
block2a_activation (Activation (None, 56, 56, 96) 0 ['block2a_bn[0][0]']
)
block2a_se_squeeze (GlobalAver (None, 96) 0 ['block2a_activation[0][0]']
agePooling2D)
block2a_se_reshape (Reshape) (None, 1, 1, 96) 0 ['block2a_se_squeeze[0][0]']
block2a_se_reduce (Conv2D) (None, 1, 1, 4) 388 ['block2a_se_reshape[0][0]']
block2a_se_expand (Conv2D) (None, 1, 1, 96) 480 ['block2a_se_reduce[0][0]']
block2a_se_excite (Multiply) (None, 56, 56, 96) 0 ['block2a_activation[0][0]',
'block2a_se_expand[0][0]']
block2a_project_conv (Conv2D) (None, 56, 56, 24) 2304 ['block2a_se_excite[0][0]']
block2a_project_bn (BatchNorma (None, 56, 56, 24) 96 ['block2a_project_conv[0][0]']
lization)
block2b_expand_conv (Conv2D) (None, 56, 56, 144) 3456 ['block2a_project_bn[0][0]']
block2b_expand_bn (BatchNormal (None, 56, 56, 144) 576 ['block2b_expand_conv[0][0]']
ization)
block2b_expand_activation (Act (None, 56, 56, 144) 0 ['block2b_expand_bn[0][0]']
ivation)
block2b_dwconv (DepthwiseConv2 (None, 56, 56, 144) 1296 ['block2b_expand_activation[0][0]
D) ']
block2b_bn (BatchNormalization (None, 56, 56, 144) 576 ['block2b_dwconv[0][0]']
)
block2b_activation (Activation (None, 56, 56, 144) 0 ['block2b_bn[0][0]']
)
block2b_se_squeeze (GlobalAver (None, 144) 0 ['block2b_activation[0][0]']
agePooling2D)
block2b_se_reshape (Reshape) (None, 1, 1, 144) 0 ['block2b_se_squeeze[0][0]']
block2b_se_reduce (Conv2D) (None, 1, 1, 6) 870 ['block2b_se_reshape[0][0]']
block2b_se_expand (Conv2D) (None, 1, 1, 144) 1008 ['block2b_se_reduce[0][0]']
block2b_se_excite (Multiply) (None, 56, 56, 144) 0 ['block2b_activation[0][0]',
'block2b_se_expand[0][0]']
block2b_project_conv (Conv2D) (None, 56, 56, 24) 3456 ['block2b_se_excite[0][0]']
block2b_project_bn (BatchNorma (None, 56, 56, 24) 96 ['block2b_project_conv[0][0]']
lization)
block2b_drop (Dropout) (None, 56, 56, 24) 0 ['block2b_project_bn[0][0]']
block2b_add (Add) (None, 56, 56, 24) 0 ['block2b_drop[0][0]',
'block2a_project_bn[0][0]']
block3a_expand_conv (Conv2D) (None, 56, 56, 144) 3456 ['block2b_add[0][0]']
block3a_expand_bn (BatchNormal (None, 56, 56, 144) 576 ['block3a_expand_conv[0][0]']
ization)
block3a_expand_activation (Act (None, 56, 56, 144) 0 ['block3a_expand_bn[0][0]']
ivation)
block3a_dwconv_pad (ZeroPaddin (None, 59, 59, 144) 0 ['block3a_expand_activation[0][0]
g2D) ']
block3a_dwconv (DepthwiseConv2 (None, 28, 28, 144) 3600 ['block3a_dwconv_pad[0][0]']
D)
block3a_bn (BatchNormalization (None, 28, 28, 144) 576 ['block3a_dwconv[0][0]']
)
block3a_activation (Activation (None, 28, 28, 144) 0 ['block3a_bn[0][0]']
)
block3a_se_squeeze (GlobalAver (None, 144) 0 ['block3a_activation[0][0]']
agePooling2D)
block3a_se_reshape (Reshape) (None, 1, 1, 144) 0 ['block3a_se_squeeze[0][0]']
block3a_se_reduce (Conv2D) (None, 1, 1, 6) 870 ['block3a_se_reshape[0][0]']
block3a_se_expand (Conv2D) (None, 1, 1, 144) 1008 ['block3a_se_reduce[0][0]']
block3a_se_excite (Multiply) (None, 28, 28, 144) 0 ['block3a_activation[0][0]',
'block3a_se_expand[0][0]']
block3a_project_conv (Conv2D) (None, 28, 28, 40) 5760 ['block3a_se_excite[0][0]']
block3a_project_bn (BatchNorma (None, 28, 28, 40) 160 ['block3a_project_conv[0][0]']
lization)
block3b_expand_conv (Conv2D) (None, 28, 28, 240) 9600 ['block3a_project_bn[0][0]']
block3b_expand_bn (BatchNormal (None, 28, 28, 240) 960 ['block3b_expand_conv[0][0]']
ization)
block3b_expand_activation (Act (None, 28, 28, 240) 0 ['block3b_expand_bn[0][0]']
ivation)
block3b_dwconv (DepthwiseConv2 (None, 28, 28, 240) 6000 ['block3b_expand_activation[0][0]
D) ']
block3b_bn (BatchNormalization (None, 28, 28, 240) 960 ['block3b_dwconv[0][0]']
)
block3b_activation (Activation (None, 28, 28, 240) 0 ['block3b_bn[0][0]']
)
block3b_se_squeeze (GlobalAver (None, 240) 0 ['block3b_activation[0][0]']
agePooling2D)
block3b_se_reshape (Reshape) (None, 1, 1, 240) 0 ['block3b_se_squeeze[0][0]']
block3b_se_reduce (Conv2D) (None, 1, 1, 10) 2410 ['block3b_se_reshape[0][0]']
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead. super().__init__(name, **kwargs)
block3b_se_expand (Conv2D) (None, 1, 1, 240) 2640 ['block3b_se_reduce[0][0]']
block3b_se_excite (Multiply) (None, 28, 28, 240) 0 ['block3b_activation[0][0]',
'block3b_se_expand[0][0]']
block3b_project_conv (Conv2D) (None, 28, 28, 40) 9600 ['block3b_se_excite[0][0]']
block3b_project_bn (BatchNorma (None, 28, 28, 40) 160 ['block3b_project_conv[0][0]']
lization)
block3b_drop (Dropout) (None, 28, 28, 40) 0 ['block3b_project_bn[0][0]']
block3b_add (Add) (None, 28, 28, 40) 0 ['block3b_drop[0][0]',
'block3a_project_bn[0][0]']
block4a_expand_conv (Conv2D) (None, 28, 28, 240) 9600 ['block3b_add[0][0]']
block4a_expand_bn (BatchNormal (None, 28, 28, 240) 960 ['block4a_expand_conv[0][0]']
ization)
block4a_expand_activation (Act (None, 28, 28, 240) 0 ['block4a_expand_bn[0][0]']
ivation)
block4a_dwconv_pad (ZeroPaddin (None, 29, 29, 240) 0 ['block4a_expand_activation[0][0]
g2D) ']
block4a_dwconv (DepthwiseConv2 (None, 14, 14, 240) 2160 ['block4a_dwconv_pad[0][0]']
D)
block4a_bn (BatchNormalization (None, 14, 14, 240) 960 ['block4a_dwconv[0][0]']
)
block4a_activation (Activation (None, 14, 14, 240) 0 ['block4a_bn[0][0]']
)
block4a_se_squeeze (GlobalAver (None, 240) 0 ['block4a_activation[0][0]']
agePooling2D)
block4a_se_reshape (Reshape) (None, 1, 1, 240) 0 ['block4a_se_squeeze[0][0]']
block4a_se_reduce (Conv2D) (None, 1, 1, 10) 2410 ['block4a_se_reshape[0][0]']
block4a_se_expand (Conv2D) (None, 1, 1, 240) 2640 ['block4a_se_reduce[0][0]']
block4a_se_excite (Multiply) (None, 14, 14, 240) 0 ['block4a_activation[0][0]',
'block4a_se_expand[0][0]']
block4a_project_conv (Conv2D) (None, 14, 14, 80) 19200 ['block4a_se_excite[0][0]']
block4a_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4a_project_conv[0][0]']
lization)
block4b_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4a_project_bn[0][0]']
block4b_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block4b_expand_conv[0][0]']
ization)
block4b_expand_activation (Act (None, 14, 14, 480) 0 ['block4b_expand_bn[0][0]']
ivation)
block4b_dwconv (DepthwiseConv2 (None, 14, 14, 480) 4320 ['block4b_expand_activation[0][0]
D) ']
block4b_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block4b_dwconv[0][0]']
)
block4b_activation (Activation (None, 14, 14, 480) 0 ['block4b_bn[0][0]']
)
block4b_se_squeeze (GlobalAver (None, 480) 0 ['block4b_activation[0][0]']
agePooling2D)
block4b_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block4b_se_squeeze[0][0]']
block4b_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block4b_se_reshape[0][0]']
block4b_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block4b_se_reduce[0][0]']
block4b_se_excite (Multiply) (None, 14, 14, 480) 0 ['block4b_activation[0][0]',
'block4b_se_expand[0][0]']
block4b_project_conv (Conv2D) (None, 14, 14, 80) 38400 ['block4b_se_excite[0][0]']
block4b_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4b_project_conv[0][0]']
lization)
block4b_drop (Dropout) (None, 14, 14, 80) 0 ['block4b_project_bn[0][0]']
block4b_add (Add) (None, 14, 14, 80) 0 ['block4b_drop[0][0]',
'block4a_project_bn[0][0]']
block4c_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4b_add[0][0]']
block4c_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block4c_expand_conv[0][0]']
ization)
block4c_expand_activation (Act (None, 14, 14, 480) 0 ['block4c_expand_bn[0][0]']
ivation)
block4c_dwconv (DepthwiseConv2 (None, 14, 14, 480) 4320 ['block4c_expand_activation[0][0]
D) ']
block4c_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block4c_dwconv[0][0]']
)
block4c_activation (Activation (None, 14, 14, 480) 0 ['block4c_bn[0][0]']
)
block4c_se_squeeze (GlobalAver (None, 480) 0 ['block4c_activation[0][0]']
agePooling2D)
block4c_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block4c_se_squeeze[0][0]']
block4c_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block4c_se_reshape[0][0]']
block4c_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block4c_se_reduce[0][0]']
block4c_se_excite (Multiply) (None, 14, 14, 480) 0 ['block4c_activation[0][0]',
'block4c_se_expand[0][0]']
block4c_project_conv (Conv2D) (None, 14, 14, 80) 38400 ['block4c_se_excite[0][0]']
block4c_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4c_project_conv[0][0]']
lization)
block4c_drop (Dropout) (None, 14, 14, 80) 0 ['block4c_project_bn[0][0]']
block4c_add (Add) (None, 14, 14, 80) 0 ['block4c_drop[0][0]',
'block4b_add[0][0]']
block5a_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4c_add[0][0]']
block5a_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block5a_expand_conv[0][0]']
ization)
block5a_expand_activation (Act (None, 14, 14, 480) 0 ['block5a_expand_bn[0][0]']
ivation)
block5a_dwconv (DepthwiseConv2 (None, 14, 14, 480) 12000 ['block5a_expand_activation[0][0]
D) ']
block5a_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block5a_dwconv[0][0]']
)
block5a_activation (Activation (None, 14, 14, 480) 0 ['block5a_bn[0][0]']
)
block5a_se_squeeze (GlobalAver (None, 480) 0 ['block5a_activation[0][0]']
agePooling2D)
block5a_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block5a_se_squeeze[0][0]']
block5a_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block5a_se_reshape[0][0]']
block5a_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block5a_se_reduce[0][0]']
block5a_se_excite (Multiply) (None, 14, 14, 480) 0 ['block5a_activation[0][0]',
'block5a_se_expand[0][0]']
block5a_project_conv (Conv2D) (None, 14, 14, 112) 53760 ['block5a_se_excite[0][0]']
block5a_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5a_project_conv[0][0]']
lization)
block5b_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5a_project_bn[0][0]']
block5b_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5b_expand_conv[0][0]']
ization)
block5b_expand_activation (Act (None, 14, 14, 672) 0 ['block5b_expand_bn[0][0]']
ivation)
block5b_dwconv (DepthwiseConv2 (None, 14, 14, 672) 16800 ['block5b_expand_activation[0][0]
D) ']
block5b_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5b_dwconv[0][0]']
)
block5b_activation (Activation (None, 14, 14, 672) 0 ['block5b_bn[0][0]']
)
block5b_se_squeeze (GlobalAver (None, 672) 0 ['block5b_activation[0][0]']
agePooling2D)
block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5b_se_squeeze[0][0]']
block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5b_se_reshape[0][0]']
block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5b_se_reduce[0][0]']
block5b_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5b_activation[0][0]',
'block5b_se_expand[0][0]']
block5b_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5b_se_excite[0][0]']
block5b_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5b_project_conv[0][0]']
lization)
block5b_drop (Dropout) (None, 14, 14, 112) 0 ['block5b_project_bn[0][0]']
block5b_add (Add) (None, 14, 14, 112) 0 ['block5b_drop[0][0]',
'block5a_project_bn[0][0]']
block5c_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5b_add[0][0]']
block5c_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5c_expand_conv[0][0]']
ization)
block5c_expand_activation (Act (None, 14, 14, 672) 0 ['block5c_expand_bn[0][0]']
ivation)
block5c_dwconv (DepthwiseConv2 (None, 14, 14, 672) 16800 ['block5c_expand_activation[0][0]
D) ']
block5c_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5c_dwconv[0][0]']
)
block5c_activation (Activation (None, 14, 14, 672) 0 ['block5c_bn[0][0]']
)
block5c_se_squeeze (GlobalAver (None, 672) 0 ['block5c_activation[0][0]']
agePooling2D)
block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5c_se_squeeze[0][0]']
block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5c_se_reshape[0][0]']
block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5c_se_reduce[0][0]']
block5c_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5c_activation[0][0]',
'block5c_se_expand[0][0]']
block5c_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5c_se_excite[0][0]']
block5c_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5c_project_conv[0][0]']
lization)
block5c_drop (Dropout) (None, 14, 14, 112) 0 ['block5c_project_bn[0][0]']
block5c_add (Add) (None, 14, 14, 112) 0 ['block5c_drop[0][0]',
'block5b_add[0][0]']
block6a_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5c_add[0][0]']
block6a_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block6a_expand_conv[0][0]']
ization)
block6a_expand_activation (Act (None, 14, 14, 672) 0 ['block6a_expand_bn[0][0]']
ivation)
block6a_dwconv_pad (ZeroPaddin (None, 17, 17, 672) 0 ['block6a_expand_activation[0][0]
g2D) ']
block6a_dwconv (DepthwiseConv2 (None, 7, 7, 672) 16800 ['block6a_dwconv_pad[0][0]']
D)
block6a_bn (BatchNormalization (None, 7, 7, 672) 2688 ['block6a_dwconv[0][0]']
)
block6a_activation (Activation (None, 7, 7, 672) 0 ['block6a_bn[0][0]']
)
block6a_se_squeeze (GlobalAver (None, 672) 0 ['block6a_activation[0][0]']
agePooling2D)
block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block6a_se_squeeze[0][0]']
block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block6a_se_reshape[0][0]']
block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block6a_se_reduce[0][0]']
block6a_se_excite (Multiply) (None, 7, 7, 672) 0 ['block6a_activation[0][0]',
'block6a_se_expand[0][0]']
block6a_project_conv (Conv2D) (None, 7, 7, 192) 129024 ['block6a_se_excite[0][0]']
block6a_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6a_project_conv[0][0]']
lization)
block6b_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6a_project_bn[0][0]']
block6b_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6b_expand_conv[0][0]']
ization)
block6b_expand_activation (Act (None, 7, 7, 1152) 0 ['block6b_expand_bn[0][0]']
ivation)
block6b_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6b_expand_activation[0][0]
D) ']
block6b_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6b_dwconv[0][0]']
)
block6b_activation (Activation (None, 7, 7, 1152) 0 ['block6b_bn[0][0]']
)
block6b_se_squeeze (GlobalAver (None, 1152) 0 ['block6b_activation[0][0]']
agePooling2D)
block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6b_se_squeeze[0][0]']
block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6b_se_reshape[0][0]']
block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6b_se_reduce[0][0]']
block6b_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6b_activation[0][0]',
'block6b_se_expand[0][0]']
block6b_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6b_se_excite[0][0]']
block6b_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6b_project_conv[0][0]']
lization)
block6b_drop (Dropout) (None, 7, 7, 192) 0 ['block6b_project_bn[0][0]']
block6b_add (Add) (None, 7, 7, 192) 0 ['block6b_drop[0][0]',
'block6a_project_bn[0][0]']
block6c_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6b_add[0][0]']
block6c_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6c_expand_conv[0][0]']
ization)
block6c_expand_activation (Act (None, 7, 7, 1152) 0 ['block6c_expand_bn[0][0]']
ivation)
block6c_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6c_expand_activation[0][0]
D) ']
block6c_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6c_dwconv[0][0]']
)
block6c_activation (Activation (None, 7, 7, 1152) 0 ['block6c_bn[0][0]']
)
block6c_se_squeeze (GlobalAver (None, 1152) 0 ['block6c_activation[0][0]']
agePooling2D)
block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6c_se_squeeze[0][0]']
block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6c_se_reshape[0][0]']
block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6c_se_reduce[0][0]']
block6c_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6c_activation[0][0]',
'block6c_se_expand[0][0]']
block6c_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6c_se_excite[0][0]']
block6c_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6c_project_conv[0][0]']
lization)
block6c_drop (Dropout) (None, 7, 7, 192) 0 ['block6c_project_bn[0][0]']
block6c_add (Add) (None, 7, 7, 192) 0 ['block6c_drop[0][0]',
'block6b_add[0][0]']
block6d_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6c_add[0][0]']
block6d_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6d_expand_conv[0][0]']
ization)
block6d_expand_activation (Act (None, 7, 7, 1152) 0 ['block6d_expand_bn[0][0]']
ivation)
block6d_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6d_expand_activation[0][0]
D) ']
block6d_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6d_dwconv[0][0]']
)
block6d_activation (Activation (None, 7, 7, 1152) 0 ['block6d_bn[0][0]']
)
block6d_se_squeeze (GlobalAver (None, 1152) 0 ['block6d_activation[0][0]']
agePooling2D)
block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6d_se_squeeze[0][0]']
block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6d_se_reshape[0][0]']
block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6d_se_reduce[0][0]']
block6d_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6d_activation[0][0]',
'block6d_se_expand[0][0]']
block6d_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6d_se_excite[0][0]']
block6d_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6d_project_conv[0][0]']
lization)
block6d_drop (Dropout) (None, 7, 7, 192) 0 ['block6d_project_bn[0][0]']
block6d_add (Add) (None, 7, 7, 192) 0 ['block6d_drop[0][0]',
'block6c_add[0][0]']
block7a_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6d_add[0][0]']
block7a_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block7a_expand_conv[0][0]']
ization)
block7a_expand_activation (Act (None, 7, 7, 1152) 0 ['block7a_expand_bn[0][0]']
ivation)
block7a_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 10368 ['block7a_expand_activation[0][0]
D) ']
block7a_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block7a_dwconv[0][0]']
)
block7a_activation (Activation (None, 7, 7, 1152) 0 ['block7a_bn[0][0]']
)
block7a_se_squeeze (GlobalAver (None, 1152) 0 ['block7a_activation[0][0]']
agePooling2D)
block7a_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block7a_se_squeeze[0][0]']
block7a_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block7a_se_reshape[0][0]']
block7a_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block7a_se_reduce[0][0]']
block7a_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block7a_activation[0][0]',
'block7a_se_expand[0][0]']
block7a_project_conv (Conv2D) (None, 7, 7, 320) 368640 ['block7a_se_excite[0][0]']
block7a_project_bn (BatchNorma (None, 7, 7, 320) 1280 ['block7a_project_conv[0][0]']
lization)
top_conv (Conv2D) (None, 7, 7, 1280) 409600 ['block7a_project_bn[0][0]']
top_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['top_conv[0][0]']
top_activation (Activation) (None, 7, 7, 1280) 0 ['top_bn[0][0]']
global_average_pooling2d_3 (Gl (None, 1280) 0 ['top_activation[0][0]']
obalAveragePooling2D)
dense_9 (Dense) (None, 1024) 1311744 ['global_average_pooling2d_3[0][0
]']
dropout_8 (Dropout) (None, 1024) 0 ['dense_9[0][0]']
dense_10 (Dense) (None, 512) 524800 ['dropout_8[0][0]']
dropout_9 (Dropout) (None, 512) 0 ['dense_10[0][0]']
dense_11 (Dense) (None, 196) 100548 ['dropout_9[0][0]']
==================================================================================================
Total params: 5,986,663
Trainable params: 5,944,640
Non-trainable params: 42,023
__________________________________________________________________________________________________
# There are 8144 training images and 8041 test images in total
history = EffNet_model.fit_generator(training_set,
steps_per_epoch = int(8144/16),
epochs = 20,
validation_data = test_set,
validation_steps = int(8041/16),
callbacks=[reduce_lr])
C:\Users\adity\AppData\Local\Temp\ipykernel_12568\1977419751.py:2: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators. history = EffNet_model.fit_generator(training_set,
Epoch 1/20 509/509 [==============================] - 188s 353ms/step - loss: 5.2223 - accuracy: 0.0140 - val_loss: 5.2985 - val_accuracy: 0.0050 - lr: 1.0000e-04 Epoch 2/20 509/509 [==============================] - 180s 353ms/step - loss: 4.3564 - accuracy: 0.0970 - val_loss: 5.5479 - val_accuracy: 0.0097 - lr: 1.0000e-04 Epoch 3/20 509/509 [==============================] - 180s 353ms/step - loss: 3.0594 - accuracy: 0.2572 - val_loss: 3.3982 - val_accuracy: 0.2073 - lr: 1.0000e-04 Epoch 4/20 509/509 [==============================] - 181s 355ms/step - loss: 2.1975 - accuracy: 0.4249 - val_loss: 4.9143 - val_accuracy: 0.0710 - lr: 1.0000e-04 Epoch 5/20 509/509 [==============================] - ETA: 0s - loss: 1.6599 - accuracy: 0.5456 Epoch 5: ReduceLROnPlateau reducing learning rate to 1.9999999494757503e-05. 509/509 [==============================] - 180s 354ms/step - loss: 1.6599 - accuracy: 0.5456 - val_loss: 4.5935 - val_accuracy: 0.1058 - lr: 1.0000e-04 Epoch 6/20 509/509 [==============================] - 180s 354ms/step - loss: 1.2720 - accuracy: 0.6429 - val_loss: 1.9672 - val_accuracy: 0.4940 - lr: 2.0000e-05 Epoch 7/20 509/509 [==============================] - 181s 355ms/step - loss: 1.1676 - accuracy: 0.6667 - val_loss: 1.0346 - val_accuracy: 0.7103 - lr: 2.0000e-05 Epoch 8/20 509/509 [==============================] - 181s 355ms/step - loss: 1.0845 - accuracy: 0.6893 - val_loss: 1.3761 - val_accuracy: 0.6272 - lr: 2.0000e-05 Epoch 9/20 509/509 [==============================] - ETA: 0s - loss: 1.0078 - accuracy: 0.7092 Epoch 9: ReduceLROnPlateau reducing learning rate to 3.999999898951501e-06. 509/509 [==============================] - 181s 355ms/step - loss: 1.0078 - accuracy: 0.7092 - val_loss: 1.1418 - val_accuracy: 0.6775 - lr: 2.0000e-05 Epoch 10/20 509/509 [==============================] - 181s 355ms/step - loss: 0.9630 - accuracy: 0.7220 - val_loss: 0.8948 - val_accuracy: 0.7451 - lr: 4.0000e-06 Epoch 11/20 509/509 [==============================] - 181s 355ms/step - loss: 0.9543 - accuracy: 0.7313 - val_loss: 0.8883 - val_accuracy: 0.7510 - lr: 4.0000e-06 Epoch 12/20 509/509 [==============================] - ETA: 0s - loss: 0.9121 - accuracy: 0.7391 Epoch 12: ReduceLROnPlateau reducing learning rate to 7.999999979801942e-07. 509/509 [==============================] - 181s 354ms/step - loss: 0.9121 - accuracy: 0.7391 - val_loss: 0.8699 - val_accuracy: 0.7517 - lr: 4.0000e-06 Epoch 13/20 509/509 [==============================] - 181s 356ms/step - loss: 0.8946 - accuracy: 0.7409 - val_loss: 0.8668 - val_accuracy: 0.7529 - lr: 8.0000e-07 Epoch 14/20 509/509 [==============================] - ETA: 0s - loss: 0.9011 - accuracy: 0.7339 Epoch 14: ReduceLROnPlateau reducing learning rate to 1.600000018697756e-07. 509/509 [==============================] - 181s 356ms/step - loss: 0.9011 - accuracy: 0.7339 - val_loss: 0.8699 - val_accuracy: 0.7520 - lr: 8.0000e-07 Epoch 15/20 509/509 [==============================] - 181s 355ms/step - loss: 0.8837 - accuracy: 0.7499 - val_loss: 0.8676 - val_accuracy: 0.7529 - lr: 1.6000e-07 Epoch 16/20 509/509 [==============================] - ETA: 0s - loss: 0.8764 - accuracy: 0.7518 Epoch 16: ReduceLROnPlateau reducing learning rate to 3.199999980552093e-08. 509/509 [==============================] - 181s 355ms/step - loss: 0.8764 - accuracy: 0.7518 - val_loss: 0.8675 - val_accuracy: 0.7534 - lr: 1.6000e-07 Epoch 17/20 509/509 [==============================] - 181s 356ms/step - loss: 0.9024 - accuracy: 0.7388 - val_loss: 0.8611 - val_accuracy: 0.7544 - lr: 3.2000e-08 Epoch 18/20 509/509 [==============================] - ETA: 0s - loss: 0.8850 - accuracy: 0.7517 Epoch 18: ReduceLROnPlateau reducing learning rate to 1e-08. 509/509 [==============================] - 178s 350ms/step - loss: 0.8850 - accuracy: 0.7517 - val_loss: 0.8674 - val_accuracy: 0.7525 - lr: 3.2000e-08 Epoch 19/20 509/509 [==============================] - 180s 353ms/step - loss: 0.9000 - accuracy: 0.7457 - val_loss: 0.8634 - val_accuracy: 0.7535 - lr: 1.0000e-08 Epoch 20/20 509/509 [==============================] - 181s 355ms/step - loss: 0.9168 - accuracy: 0.7371 - val_loss: 0.8652 - val_accuracy: 0.7539 - lr: 1.0000e-08
# EffNet_model.save('./eff_model.h5')
EffNet_model.save_weights('./eff_model.h5')
## Accuracy and Loss plots
import matplotlib.pyplot as plt
accuracy = history.history['accuracy']
val_accuracy = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(len(accuracy)) # Get number of epochs
plt.plot (epochs, accuracy, label = 'training accuracy')
plt.plot (epochs, val_accuracy, label = 'validation accuracy')
plt.title ('Training and validation accuracy')
plt.legend(loc = 'lower right')
plt.figure()
plt.plot (epochs, loss, label = 'training loss')
plt.plot (epochs, val_loss, label = 'validation loss')
plt.legend(loc = 'upper right')
plt.title ('Training and validation loss')
Text(0.5, 1.0, 'Training and validation loss')
1) Accuracy Plot:
Training Accuracy: The training accuracy plot shows how well the model is performing on the training data over epochs. It measures the proportion of correctly classified samples in the training set. Ideally, we want to see an increasing trend, indicating that the model is learning the underlying patterns in the data.
Validation Accuracy: The validation accuracy plot indicates the model's performance on unseen data, usually a validation set. It shows how well the model is generalizing to new examples. A rising validation accuracy suggests that the model is not overfitting and is able to generalize to unseen data effectively.
2) Loss Plot:
Training Loss: The training loss plot shows the value of the loss function (in this case, categorical cross-entropy) on the training data over epochs. It measures the difference between the model's predicted probabilities and the actual labels. A decreasing training loss indicates that the model is improving in minimizing this difference.
Validation Loss: The validation loss plot measures the loss on the validation set. It provides insights into how well the model is generalizing to unseen data. A decreasing validation loss suggests that the model is not overfitting and is able to generalize well.
3) Overall Insights:
Model Performance: Look for consistency between the training and validation accuracy/loss. If the validation accuracy/loss starts diverging from the training accuracy/loss, it could indicate overfitting (if validation metrics worsen) or poor generalization (if validation metrics don't improve).
Model Convergence: Check if the training and validation metrics converge or stabilize after a certain number of epochs. Convergence indicates that the model has learned as much as it can from the data provided. If the metrics continue to improve, consider training for more epochs.
Hyperparameter Tuning: If the model's performance is not satisfactory, consider adjusting hyperparameters such as learning rate, dropout rate, or layer sizes. Fine-tuning these hyperparameters can significantly impact the model's performance.
# Performance of Efficient Net Model
# Re-initalizing the test data generator with shuffle=False to create the confusion matrix
import numpy as np
test_set = test_datagen.flow_from_directory('C:\\Users\\adity\\Downloads\\capstone\\Car Images\\Test Images\\',
target_size = (224, 224),
batch_size = 16,
shuffle=False,
class_mode = 'categorical')
# Predict the whole generator to get predictions
Y_pred = EffNet_model.predict_generator(test_set, int(8041/16+1))
# Find out the predictions classes with maximum probability
y_pred = np.argmax(Y_pred, axis=1)
# Utilities for confusion matrix
from sklearn.metrics import classification_report, confusion_matrix
# Printing the confusion matrix based on the actual data vs predicted data.
print(confusion_matrix(test_set.classes, y_pred))
# Printing the classification report
print(classification_report(test_set.classes, y_pred, target_names=prediction_class))
Found 8041 images belonging to 196 classes.
C:\Users\adity\AppData\Local\Temp\ipykernel_12568\1874819283.py:11: UserWarning: `Model.predict_generator` is deprecated and will be removed in a future version. Please use `Model.predict`, which supports generators. Y_pred = EffNet_model.predict_generator(test_set, int(8041/16+1))
[[42 0 0 ... 0 0 0]
[ 0 32 0 ... 0 0 0]
[ 0 0 11 ... 0 0 0]
...
[ 0 0 0 ... 33 0 0]
[ 0 0 0 ... 0 29 0]
[ 0 0 0 ... 0 0 32]]
precision recall f1-score support
AM General Hummer SUV 2000 0.78 0.95 0.86 44
Acura Integra Type R 2001 0.70 0.73 0.71 44
Acura RL Sedan 2012 0.52 0.34 0.42 32
Acura TL Sedan 2012 0.68 0.88 0.77 43
Acura TL Type-S 2008 0.71 0.69 0.70 42
Acura TSX Sedan 2012 0.69 0.55 0.61 40
Acura ZDX Hatchback 2012 0.76 0.79 0.78 39
Aston Martin V8 Vantage Convertible 2012 0.51 0.67 0.58 45
Aston Martin V8 Vantage Coupe 2012 0.61 0.41 0.49 41
Aston Martin Virage Convertible 2012 0.65 0.39 0.49 33
Aston Martin Virage Coupe 2012 0.75 0.79 0.77 38
Audi 100 Sedan 1994 0.59 0.72 0.65 40
Audi 100 Wagon 1994 0.81 0.71 0.76 42
Audi A5 Coupe 2012 0.65 0.80 0.72 41
Audi R8 Coupe 2012 0.86 0.84 0.85 43
Audi RS 4 Convertible 2008 0.76 0.72 0.74 36
Audi S4 Sedan 2007 0.53 0.69 0.60 45
Audi S4 Sedan 2012 0.29 0.18 0.22 39
Audi S5 Convertible 2012 0.70 0.62 0.66 42
Audi S5 Coupe 2012 0.39 0.33 0.36 42
Audi S6 Sedan 2011 0.59 0.59 0.59 46
Audi TT Hatchback 2011 0.43 0.45 0.44 40
Audi TT RS Coupe 2012 0.68 0.59 0.63 39
Audi TTS Coupe 2012 0.66 0.50 0.57 42
Audi V8 Sedan 1994 0.53 0.49 0.51 43
BMW 1 Series Convertible 2012 0.74 0.57 0.65 35
BMW 1 Series Coupe 2012 0.78 0.88 0.83 41
BMW 3 Series Sedan 2012 0.63 0.64 0.64 42
BMW 3 Series Wagon 2012 0.59 0.78 0.67 41
BMW 6 Series Convertible 2007 0.65 0.64 0.64 44
BMW ActiveHybrid 5 Sedan 2012 0.67 0.82 0.74 34
BMW M3 Coupe 2012 0.60 0.80 0.69 44
BMW M5 Sedan 2010 0.57 0.71 0.63 41
BMW M6 Convertible 2010 0.42 0.59 0.49 41
BMW X3 SUV 2012 0.91 0.79 0.85 38
BMW X5 SUV 2007 0.48 0.80 0.60 41
BMW X6 SUV 2012 0.85 0.67 0.75 42
BMW Z4 Convertible 2012 0.76 0.62 0.68 40
Bentley Arnage Sedan 2009 0.62 0.90 0.74 39
Bentley Continental Flying Spur Sedan 2007 0.62 0.91 0.73 44
Bentley Continental GT Coupe 2007 0.56 0.30 0.39 46
Bentley Continental GT Coupe 2012 0.62 0.62 0.62 34
Bentley Continental Supersports Conv. Convertible 2012 0.84 0.75 0.79 36
Bentley Mulsanne Sedan 2011 0.77 0.69 0.73 35
Bugatti Veyron 16.4 Convertible 2009 0.74 0.78 0.76 32
Bugatti Veyron 16.4 Coupe 2009 0.76 0.86 0.80 43
Buick Enclave SUV 2012 0.77 0.86 0.81 42
Buick Rainier SUV 2007 0.72 0.86 0.78 42
Buick Regal GS 2012 0.82 0.77 0.79 35
Buick Verano Sedan 2012 0.64 0.86 0.74 37
Cadillac CTS-V Sedan 2012 0.93 1.00 0.97 43
Cadillac Escalade EXT Crew Cab 2007 0.83 0.77 0.80 44
Cadillac SRX SUV 2012 0.81 0.93 0.86 41
Chevrolet Avalanche Crew Cab 2012 0.74 0.71 0.73 45
Chevrolet Camaro Convertible 2012 0.86 0.70 0.78 44
Chevrolet Cobalt SS 2010 0.83 0.59 0.69 41
Chevrolet Corvette Convertible 2012 0.62 0.77 0.69 39
Chevrolet Corvette Ron Fellows Edition Z06 2007 0.85 0.89 0.87 37
Chevrolet Corvette ZR1 2012 0.95 0.76 0.84 46
Chevrolet Express Cargo Van 2007 0.52 0.59 0.55 29
Chevrolet Express Van 2007 0.44 0.20 0.27 35
Chevrolet HHR SS 2010 0.97 0.92 0.94 36
Chevrolet Impala Sedan 2007 0.76 0.65 0.70 43
Chevrolet Malibu Hybrid Sedan 2010 0.74 0.66 0.69 38
Chevrolet Malibu Sedan 2007 0.63 0.70 0.67 44
Chevrolet Monte Carlo Coupe 2007 0.71 0.64 0.67 45
Chevrolet Silverado 1500 Classic Extended Cab 2007 0.84 0.88 0.86 42
Chevrolet Silverado 1500 Extended Cab 2012 0.47 0.49 0.48 43
Chevrolet Silverado 1500 Hybrid Crew Cab 2012 0.58 0.45 0.51 40
Chevrolet Silverado 1500 Regular Cab 2012 0.60 0.77 0.67 44
Chevrolet Silverado 2500HD Regular Cab 2012 0.59 0.53 0.56 38
Chevrolet Sonic Sedan 2012 0.78 0.66 0.72 44
Chevrolet Tahoe Hybrid SUV 2012 0.68 0.51 0.58 37
Chevrolet TrailBlazer SS 2009 0.85 0.85 0.85 40
Chevrolet Traverse SUV 2012 0.81 0.80 0.80 44
Chrysler 300 SRT-8 2010 0.72 0.81 0.76 48
Chrysler Aspen SUV 2009 0.86 0.84 0.85 43
Chrysler Crossfire Convertible 2008 0.91 0.93 0.92 43
Chrysler PT Cruiser Convertible 2008 0.98 0.93 0.95 45
Chrysler Sebring Convertible 2010 0.88 0.75 0.81 40
Chrysler Town and Country Minivan 2012 0.87 0.73 0.79 37
Daewoo Nubira Wagon 2002 0.80 0.80 0.80 45
Dodge Caliber Wagon 2007 0.73 0.86 0.79 42
Dodge Caliber Wagon 2012 0.64 0.68 0.66 40
Dodge Caravan Minivan 1997 0.93 0.95 0.94 43
Dodge Challenger SRT8 2011 0.84 0.95 0.89 39
Dodge Charger SRT-8 2009 0.68 0.71 0.70 42
Dodge Charger Sedan 2012 0.56 0.49 0.52 41
Dodge Dakota Club Cab 2007 0.68 0.79 0.73 38
Dodge Dakota Crew Cab 2010 0.88 0.73 0.80 41
Dodge Durango SUV 2007 0.86 0.84 0.85 45
Dodge Durango SUV 2012 0.84 0.88 0.86 43
Dodge Journey SUV 2012 0.93 0.84 0.88 44
Dodge Magnum Wagon 2008 0.80 0.88 0.83 40
Dodge Ram Pickup 3500 Crew Cab 2010 0.87 0.81 0.84 42
Dodge Ram Pickup 3500 Quad Cab 2009 0.64 0.68 0.66 44
Dodge Sprinter Cargo Van 2009 0.81 0.54 0.65 39
Eagle Talon Hatchback 1998 0.84 0.67 0.75 46
FIAT 500 Abarth 2012 0.77 1.00 0.87 27
FIAT 500 Convertible 2012 0.85 0.88 0.87 33
Ferrari 458 Italia Convertible 2012 0.63 0.74 0.68 39
Ferrari 458 Italia Coupe 2012 0.74 0.60 0.66 42
Ferrari California Convertible 2012 0.76 0.72 0.74 39
Ferrari FF Coupe 2012 0.82 0.74 0.78 42
Fisker Karma Sedan 2012 0.76 0.74 0.75 43
Ford E-Series Wagon Van 2012 0.97 0.92 0.94 37
Ford Edge SUV 2012 0.82 0.84 0.83 43
Ford Expedition EL SUV 2009 0.87 0.89 0.88 44
Ford F-150 Regular Cab 2007 0.73 0.84 0.78 45
Ford F-150 Regular Cab 2012 0.93 0.90 0.92 42
Ford F-450 Super Duty Crew Cab 2012 0.93 0.95 0.94 41
Ford Fiesta Sedan 2012 0.84 0.86 0.85 42
Ford Focus Sedan 2007 0.80 0.82 0.81 45
Ford Freestar Minivan 2007 0.88 0.95 0.91 44
Ford GT Coupe 2006 0.75 0.80 0.77 45
Ford Mustang Convertible 2007 0.90 0.61 0.73 44
Ford Ranger SuperCab 2011 0.88 0.90 0.89 42
GMC Acadia SUV 2012 0.83 0.80 0.81 44
GMC Canyon Extended Cab 2012 0.91 0.75 0.82 40
GMC Savana Van 2012 0.71 0.87 0.78 68
GMC Terrain SUV 2012 0.79 0.90 0.84 41
GMC Yukon Hybrid SUV 2012 0.72 0.67 0.69 42
Geo Metro Convertible 1993 0.86 0.86 0.86 44
HUMMER H2 SUT Crew Cab 2009 0.67 0.79 0.72 43
HUMMER H3T Crew Cab 2010 0.80 0.51 0.62 39
Honda Accord Coupe 2012 0.79 0.67 0.72 39
Honda Accord Sedan 2012 0.69 0.71 0.70 38
Honda Odyssey Minivan 2007 0.64 0.85 0.73 41
Honda Odyssey Minivan 2012 0.67 0.93 0.78 42
Hyundai Accent Sedan 2012 0.77 0.42 0.54 24
Hyundai Azera Sedan 2012 0.79 0.62 0.69 42
Hyundai Elantra Sedan 2007 0.76 0.76 0.76 42
Hyundai Elantra Touring Hatchback 2012 0.82 0.79 0.80 42
Hyundai Genesis Sedan 2012 0.85 0.77 0.80 43
Hyundai Santa Fe SUV 2012 0.92 0.83 0.88 42
Hyundai Sonata Hybrid Sedan 2012 0.70 1.00 0.82 33
Hyundai Sonata Sedan 2012 0.76 0.87 0.81 39
Hyundai Tucson SUV 2012 0.84 0.98 0.90 43
Hyundai Veloster Hatchback 2012 0.84 0.66 0.74 41
Hyundai Veracruz SUV 2012 0.71 0.57 0.63 42
Infiniti G Coupe IPL 2012 0.77 0.71 0.74 34
Infiniti QX56 SUV 2011 0.97 0.88 0.92 32
Isuzu Ascender SUV 2008 0.94 0.85 0.89 40
Jaguar XK XKR 2012 0.65 0.52 0.58 46
Jeep Compass SUV 2012 0.78 0.83 0.80 42
Jeep Grand Cherokee SUV 2012 0.86 0.56 0.68 45
Jeep Liberty SUV 2012 0.92 0.82 0.87 44
Jeep Patriot SUV 2012 0.78 0.86 0.82 44
Jeep Wrangler SUV 2012 0.93 0.98 0.95 43
Lamborghini Aventador Coupe 2012 0.78 0.74 0.76 43
Lamborghini Diablo Coupe 2001 0.84 0.82 0.83 44
Lamborghini Gallardo LP 570-4 Superleggera 2012 0.84 0.77 0.81 35
Lamborghini Reventon Coupe 2008 0.63 0.89 0.74 36
Land Rover LR2 SUV 2012 0.86 0.76 0.81 42
Land Rover Range Rover SUV 2012 0.77 0.95 0.85 42
Lincoln Town Car Sedan 2011 0.85 0.85 0.85 39
MINI Cooper Roadster Convertible 2012 0.91 0.89 0.90 36
Maybach Landaulet Convertible 2012 0.56 0.97 0.71 29
Mazda Tribute SUV 2011 0.86 0.67 0.75 36
McLaren MP4-12C Coupe 2012 0.85 0.91 0.88 44
Mercedes-Benz 300-Class Convertible 1993 0.83 0.90 0.86 48
Mercedes-Benz C-Class Sedan 2012 0.84 0.82 0.83 45
Mercedes-Benz E-Class Sedan 2012 0.74 0.65 0.69 43
Mercedes-Benz S-Class Sedan 2012 0.68 0.86 0.76 44
Mercedes-Benz SL-Class Coupe 2009 0.86 0.83 0.85 36
Mercedes-Benz Sprinter Van 2012 0.68 0.98 0.80 41
Mitsubishi Lancer Sedan 2012 0.92 0.70 0.80 47
Nissan 240SX Coupe 1998 0.91 0.93 0.92 46
Nissan Juke Hatchback 2012 0.81 0.80 0.80 44
Nissan Leaf Hatchback 2012 0.91 0.95 0.93 42
Nissan NV Passenger Van 2012 0.87 0.89 0.88 38
Plymouth Neon Coupe 1999 0.89 0.93 0.91 44
Porsche Panamera Sedan 2012 0.63 0.67 0.65 43
Ram C-V Cargo Van Minivan 2012 0.90 0.68 0.78 41
Rolls-Royce Ghost Sedan 2012 0.80 0.74 0.77 38
Rolls-Royce Phantom Drophead Coupe Convertible 2012 0.81 0.70 0.75 30
Rolls-Royce Phantom Sedan 2012 0.73 0.80 0.76 44
Scion xD Hatchback 2012 0.83 0.59 0.69 41
Spyker C8 Convertible 2009 0.73 0.78 0.75 45
Spyker C8 Coupe 2009 0.81 0.62 0.70 42
Suzuki Aerio Sedan 2007 0.78 0.74 0.76 38
Suzuki Kizashi Sedan 2012 0.77 0.72 0.74 46
Suzuki SX4 Hatchback 2012 0.90 0.83 0.86 42
Suzuki SX4 Sedan 2012 0.58 0.55 0.56 40
Tesla Model S Sedan 2012 0.63 0.84 0.72 38
Toyota 4Runner SUV 2012 0.83 0.88 0.85 40
Toyota Camry Sedan 2012 0.75 0.91 0.82 43
Toyota Corolla Sedan 2012 0.84 0.63 0.72 43
Toyota Sequoia SUV 2012 0.94 0.82 0.87 38
Volkswagen Beetle Hatchback 2012 0.90 0.88 0.89 42
Volkswagen Golf Hatchback 1991 0.79 0.98 0.87 46
Volkswagen Golf Hatchback 2012 0.78 0.58 0.67 43
Volvo 240 Sedan 1993 0.86 0.96 0.91 45
Volvo C30 Hatchback 2012 0.92 0.80 0.86 41
Volvo XC90 SUV 2007 0.91 0.67 0.77 43
smart fortwo Convertible 2012 0.86 0.80 0.83 40
accuracy 0.75 8041
macro avg 0.76 0.75 0.75 8041
weighted avg 0.76 0.75 0.75 8041
img_size = 224
num_class = 196
def batch_generator(df, model_used="efficientnet", batch_size=32):
while True:
#Create indexes
image_nums = np.random.randint(0,df.shape[0], size=batch_size)
#Create empty arrays
#1. To hold image input
batch_images = np.zeros(shape=(batch_size, img_size, img_size, 3))
#Classification Labels
batch_labels = np.zeros(shape=(batch_size, num_class))
#Regression labels - 4 numbers per example image
batch_bboxes = np.zeros(shape=(batch_size, 4))
for i in range(batch_size):
#Read image and resize
img = tf.keras.preprocessing.image.load_img(df.loc[image_nums[i], 'path'],
target_size=(img_size, img_size))
#Convert to numpy array
img_array = tf.keras.preprocessing.image.img_to_array(img)
#Update batch
batch_images[i] = img_array
#Read image classification label & convert to one hot vector
# cl_label = master_df.loc[image_nums[i], 'label']
cl_label = df['label']
cl_label = tf.keras.utils.to_categorical(np.asarray(cl_label.factorize()[0]), num_classes=num_class)
batch_labels[i] = cl_label[i]
# to_categorical(np.asarray(y_train.factorize()[0]))
#Read and resize bounding box co-ordinates
img_width = df.loc[image_nums[i], 'width']
img_height = df.loc[image_nums[i], 'height']
xmin = df.loc[image_nums[i], 'x_min'] * img_size/img_width
xmax = df.loc[image_nums[i], 'x_max'] * img_size/img_width
ymin = df.loc[image_nums[i], 'y_min'] * img_size/img_height
ymax = df.loc[image_nums[i], 'y_max'] * img_size/img_height
#We will ask model to predict xmin, ymin, width and height of bounding box
batch_bboxes[i] = [xmin, ymin, xmax-xmin, ymax-ymin]
#Normalize batch images as per Pre-trained model to be used
for i in range(batch_size):
if model_used == "resnet":
batch_images[i] = tf.keras.applications.resnet.preprocess_input(batch_images[i])
elif model_used == "mobilenet":
batch_images[i] = tf.keras.applications.mobilenet.preprocess_input(batch_images[i])
elif model_used == "efficientnet":
batch_images[i] = tf.keras.applications.efficientnet.preprocess_input(batch_images[i])
#Make bounding boxes (x, y, w, h) as numbers between 0 and 1 - this seems to work better
batch_bboxes = batch_bboxes/img_size
#Return batch - use yield function to make it a python generator
yield batch_images, [batch_labels, batch_bboxes]
# Creating test an train dataframes from master dataframe
train_df = master_df[master_df['source']=='train'].reset_index().drop(columns=['index'])
test_df = master_df[master_df['source']=='test'].reset_index().drop(columns=['index'])
from sklearn import preprocessing
from keras.utils.np_utils import to_categorical
label_encoder = preprocessing.LabelEncoder()
train_df['label_encode'] = label_encoder.fit_transform(train_df['label'])
test_df['label_encode'] = label_encoder.fit_transform(test_df['label'])
train_df.columns
Index(['id', 'image', 'label', 'dataset', 'height', 'width', 'n_channels',
'path', 'source', 'x_min', 'y_min', 'x_max', 'y_max', 'Image class',
'model_year', 'label_encode'],
dtype='object')
gen = batch_generator(train_df, batch_size=2, model_used="mobilenet")
X, y = next(gen)
print(X.shape)
print(y[0].shape, y[1].shape)
print(y)
ll1111 = []
# label_encoder.inverse_transform(y[0][0])
for i in range(0,len(y[0])):
y1 = int(np.argmax(y[0][i], axis=-1))
ll1111.extend([y1])
print(len(ll1111))
print(label_encoder.inverse_transform(ll1111))
plt.figure(figsize =(10, 4))
#subplot(r,c) provide the no. of rows and columns
f, axarr = plt.subplots(len(y[0]),1)
for i in range(0,len(y[0])):
axarr[i].imshow(X[i])
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers). Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
(2, 224, 224, 3)
(2, 196) (2, 4)
[array([[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0.],
[0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0.]]), array([[0.058 , 0.0960961 , 0.886 , 0.8978979 ],
[0.03333333, 0.25906183, 0.916 , 0.68230277]])]
2
['AM General Hummer SUV 2000' 'Acura Integra Type R 2001']
<Figure size 1000x400 with 0 Axes>
gen = batch_generator(test_df, batch_size=2, model_used="mobilenet")
X, y = next(gen)
print(X.shape)
print(y[0].shape, y[1].shape)
print(y)
ll1111 = []
# label_encoder.inverse_transform(y[0][0])
for i in range(0,len(y[0])):
y1 = int(np.argmax(y[0][i], axis=-1))
ll1111.extend([y1])
print(len(ll1111))
print(label_encoder.inverse_transform(ll1111))
plt.figure(figsize =(10, 4))
#subplot(r,c) provide the no. of rows and columns
f, axarr = plt.subplots(len(y[0]),1)
for i in range(0,len(y[0])):
axarr[i].imshow(X[i])
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers). Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
(2, 224, 224, 3)
(2, 196) (2, 4)
[array([[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0.],
[0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0.]]), array([[0.04333333, 0.40043764, 0.83777778, 0.47921225],
[0.035 , 0.215 , 0.9375 , 0.61666667]])]
2
['AM General Hummer SUV 2000' 'Acura Integra Type R 2001']
<Figure size 1000x400 with 0 Axes>
def calculate_iou(y_true, y_pred):
"""
Input:
Keras provides the input as numpy arrays with shape (batch_size, num_columns).
Arguments:
y_true -- first box, numpy array with format [x, y, width, height, conf_score]
y_pred -- second box, numpy array with format [x, y, width, height, conf_score]
x any y are the coordinates of the top left corner of each box.
Output: IoU of type float32. (This is a ratio. Max is 1. Min is 0.)
"""
results = []
for i in range(0,y_true.shape[0]):
# set the types so we are sure what type we are using
y_true = np.array(y_true, dtype=np.float32)
y_pred = np.array(y_pred, dtype=np.float32)
#print(y_true.shape)
#print(y_pred.shape)
# boxTrue
x_boxTrue_tleft = y_true[i,0] # numpy index selection
y_boxTrue_tleft = y_true[i,1]
boxTrue_width = y_true[i,2]
boxTrue_height = y_true[i,3]
area_boxTrue = (boxTrue_width * boxTrue_height)
# boxPred
x_boxPred_tleft = y_pred[i,0]
y_boxPred_tleft = y_pred[i,1]
boxPred_width = y_pred[i,2]
boxPred_height = y_pred[i,3]
area_boxPred = (boxPred_width * boxPred_height)
# calculate the bottom right coordinates for boxTrue and boxPred
# boxTrue
x_boxTrue_br = x_boxTrue_tleft + boxTrue_width
y_boxTrue_br = y_boxTrue_tleft + boxTrue_height # Version 2 revision
# boxPred
x_boxPred_br = x_boxPred_tleft + boxPred_width
y_boxPred_br = y_boxPred_tleft + boxPred_height # Version 2 revision
# calculate the top left and bottom right coordinates for the intersection box, boxInt
# boxInt - top left coords
x_boxInt_tleft = np.max([x_boxTrue_tleft,x_boxPred_tleft])
y_boxInt_tleft = np.max([y_boxTrue_tleft,y_boxPred_tleft]) # Version 2 revision
# boxInt - bottom right coords
x_boxInt_br = np.min([x_boxTrue_br,x_boxPred_br])
y_boxInt_br = np.min([y_boxTrue_br,y_boxPred_br])
# Calculate the area of boxInt, i.e. the area of the intersection
# between boxTrue and boxPred.
# The np.max() function forces the intersection area to 0 if the boxes don't overlap.
# Version 2 revision
area_of_intersection = \
np.max([0,(x_boxInt_br - x_boxInt_tleft)]) * np.max([0,(y_boxInt_br - y_boxInt_tleft)])
iou = area_of_intersection / ((area_boxTrue + area_boxPred) - area_of_intersection)
# This must match the type used in py_func
iou = np.array(iou, dtype=np.float32)
# append the result to a list at the end of each loop
results.append(iou)
# return the mean IoU score for the batch
return np.mean(results)
def IoU(y_true, y_pred):
# Note: the type float32 is very important. It must be the same type as the output from
# the python function above or you too may spend many late night hours
# trying to debug and almost give up.
iou = tf.py_function(calculate_iou, [y_true, y_pred], tf.float32)
return iou
Mobile Net with Regression¶
# Mobile Net with Faster RCNN
from tensorflow.keras.applications.mobilenet import MobileNet
model = MobileNet(input_shape=(224, 224, 3), include_top=False, weights='imagenet')
#Freeze all the layers
for layer in model.layers:
layer.trainable = True
#get Output layer of Pre-trained model
x1 = model.output
#Pooling layer
x2 = GlobalAveragePooling2D()(x1)
#Flatten the output to feed to Dense layer
x3 = Dropout(0.2)(x2)
#Add one Dense layer
x4_1 = Dense(1024, activation='relu')(x3)
x4_2 = Dropout(0.15)(x4_1)
#Add one Dense layer
x4_3 = Dense(512, activation='relu')(x4_2)
x4_4 = Dropout(0.05)(x4_3)
#Add one Dense layer
x4 = Dense(256, activation='relu')(x4_4)
#Batch Norm
x5 = BatchNormalization()(x4)
#Print the number of classes
print(num_class)
#Classification
label_output = Dense(num_class, activation='softmax', name='class_op')(x5)
196
model.summary()
Model: "mobilenet_1.00_224"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 224, 224, 3)] 0
conv1 (Conv2D) (None, 112, 112, 32) 864
conv1_bn (BatchNormalizatio (None, 112, 112, 32) 128
n)
conv1_relu (ReLU) (None, 112, 112, 32) 0
conv_dw_1 (DepthwiseConv2D) (None, 112, 112, 32) 288
conv_dw_1_bn (BatchNormaliz (None, 112, 112, 32) 128
ation)
conv_dw_1_relu (ReLU) (None, 112, 112, 32) 0
conv_pw_1 (Conv2D) (None, 112, 112, 64) 2048
conv_pw_1_bn (BatchNormaliz (None, 112, 112, 64) 256
ation)
conv_pw_1_relu (ReLU) (None, 112, 112, 64) 0
conv_pad_2 (ZeroPadding2D) (None, 113, 113, 64) 0
conv_dw_2 (DepthwiseConv2D) (None, 56, 56, 64) 576
conv_dw_2_bn (BatchNormaliz (None, 56, 56, 64) 256
ation)
conv_dw_2_relu (ReLU) (None, 56, 56, 64) 0
conv_pw_2 (Conv2D) (None, 56, 56, 128) 8192
conv_pw_2_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_2_relu (ReLU) (None, 56, 56, 128) 0
conv_dw_3 (DepthwiseConv2D) (None, 56, 56, 128) 1152
conv_dw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_dw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pw_3 (Conv2D) (None, 56, 56, 128) 16384
conv_pw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pad_4 (ZeroPadding2D) (None, 57, 57, 128) 0
conv_dw_4 (DepthwiseConv2D) (None, 28, 28, 128) 1152
conv_dw_4_bn (BatchNormaliz (None, 28, 28, 128) 512
ation)
conv_dw_4_relu (ReLU) (None, 28, 28, 128) 0
conv_pw_4 (Conv2D) (None, 28, 28, 256) 32768
conv_pw_4_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_4_relu (ReLU) (None, 28, 28, 256) 0
conv_dw_5 (DepthwiseConv2D) (None, 28, 28, 256) 2304
conv_dw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_dw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pw_5 (Conv2D) (None, 28, 28, 256) 65536
conv_pw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pad_6 (ZeroPadding2D) (None, 29, 29, 256) 0
conv_dw_6 (DepthwiseConv2D) (None, 14, 14, 256) 2304
conv_dw_6_bn (BatchNormaliz (None, 14, 14, 256) 1024
ation)
conv_dw_6_relu (ReLU) (None, 14, 14, 256) 0
conv_pw_6 (Conv2D) (None, 14, 14, 512) 131072
conv_pw_6_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_6_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_7 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_7 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_8 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_8 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_9 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_9 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_10 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_10 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_11 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_11 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pad_12 (ZeroPadding2D) (None, 15, 15, 512) 0
conv_dw_12 (DepthwiseConv2D (None, 7, 7, 512) 4608
)
conv_dw_12_bn (BatchNormali (None, 7, 7, 512) 2048
zation)
conv_dw_12_relu (ReLU) (None, 7, 7, 512) 0
conv_pw_12 (Conv2D) (None, 7, 7, 1024) 524288
conv_pw_12_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_12_relu (ReLU) (None, 7, 7, 1024) 0
conv_dw_13 (DepthwiseConv2D (None, 7, 7, 1024) 9216
)
conv_dw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_dw_13_relu (ReLU) (None, 7, 7, 1024) 0
conv_pw_13 (Conv2D) (None, 7, 7, 1024) 1048576
conv_pw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_13_relu (ReLU) (None, 7, 7, 1024) 0
=================================================================
Total params: 3,228,864
Trainable params: 3,206,976
Non-trainable params: 21,888
_________________________________________________________________
len(model.layers)
86
model.output
<KerasTensor: shape=(None, 7, 7, 1024) dtype=float32 (created by layer 'conv_pw_13_relu')>
label_output
<KerasTensor: shape=(None, 196) dtype=float32 (created by layer 'class_op')>
#Regression
bbox_output = Dense(4, activation='sigmoid', name='reg_op')(x5)
bbox_output
<KerasTensor: shape=(None, 4) dtype=float32 (created by layer 'reg_op')>
#Non Sequential model as it has two different outputs
final_model = Model(inputs=model.input, #Pre-trained model input as input layer
outputs=[label_output,bbox_output]) #Output layer added
final_model.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
conv1 (Conv2D) (None, 112, 112, 32 864 ['input_1[0][0]']
)
conv1_bn (BatchNormalization) (None, 112, 112, 32 128 ['conv1[0][0]']
)
conv1_relu (ReLU) (None, 112, 112, 32 0 ['conv1_bn[0][0]']
)
conv_dw_1 (DepthwiseConv2D) (None, 112, 112, 32 288 ['conv1_relu[0][0]']
)
conv_dw_1_bn (BatchNormalizati (None, 112, 112, 32 128 ['conv_dw_1[0][0]']
on) )
conv_dw_1_relu (ReLU) (None, 112, 112, 32 0 ['conv_dw_1_bn[0][0]']
)
conv_pw_1 (Conv2D) (None, 112, 112, 64 2048 ['conv_dw_1_relu[0][0]']
)
conv_pw_1_bn (BatchNormalizati (None, 112, 112, 64 256 ['conv_pw_1[0][0]']
on) )
conv_pw_1_relu (ReLU) (None, 112, 112, 64 0 ['conv_pw_1_bn[0][0]']
)
conv_pad_2 (ZeroPadding2D) (None, 113, 113, 64 0 ['conv_pw_1_relu[0][0]']
)
conv_dw_2 (DepthwiseConv2D) (None, 56, 56, 64) 576 ['conv_pad_2[0][0]']
conv_dw_2_bn (BatchNormalizati (None, 56, 56, 64) 256 ['conv_dw_2[0][0]']
on)
conv_dw_2_relu (ReLU) (None, 56, 56, 64) 0 ['conv_dw_2_bn[0][0]']
conv_pw_2 (Conv2D) (None, 56, 56, 128) 8192 ['conv_dw_2_relu[0][0]']
conv_pw_2_bn (BatchNormalizati (None, 56, 56, 128) 512 ['conv_pw_2[0][0]']
on)
conv_pw_2_relu (ReLU) (None, 56, 56, 128) 0 ['conv_pw_2_bn[0][0]']
conv_dw_3 (DepthwiseConv2D) (None, 56, 56, 128) 1152 ['conv_pw_2_relu[0][0]']
conv_dw_3_bn (BatchNormalizati (None, 56, 56, 128) 512 ['conv_dw_3[0][0]']
on)
conv_dw_3_relu (ReLU) (None, 56, 56, 128) 0 ['conv_dw_3_bn[0][0]']
conv_pw_3 (Conv2D) (None, 56, 56, 128) 16384 ['conv_dw_3_relu[0][0]']
conv_pw_3_bn (BatchNormalizati (None, 56, 56, 128) 512 ['conv_pw_3[0][0]']
on)
conv_pw_3_relu (ReLU) (None, 56, 56, 128) 0 ['conv_pw_3_bn[0][0]']
conv_pad_4 (ZeroPadding2D) (None, 57, 57, 128) 0 ['conv_pw_3_relu[0][0]']
conv_dw_4 (DepthwiseConv2D) (None, 28, 28, 128) 1152 ['conv_pad_4[0][0]']
conv_dw_4_bn (BatchNormalizati (None, 28, 28, 128) 512 ['conv_dw_4[0][0]']
on)
conv_dw_4_relu (ReLU) (None, 28, 28, 128) 0 ['conv_dw_4_bn[0][0]']
conv_pw_4 (Conv2D) (None, 28, 28, 256) 32768 ['conv_dw_4_relu[0][0]']
conv_pw_4_bn (BatchNormalizati (None, 28, 28, 256) 1024 ['conv_pw_4[0][0]']
on)
conv_pw_4_relu (ReLU) (None, 28, 28, 256) 0 ['conv_pw_4_bn[0][0]']
conv_dw_5 (DepthwiseConv2D) (None, 28, 28, 256) 2304 ['conv_pw_4_relu[0][0]']
conv_dw_5_bn (BatchNormalizati (None, 28, 28, 256) 1024 ['conv_dw_5[0][0]']
on)
conv_dw_5_relu (ReLU) (None, 28, 28, 256) 0 ['conv_dw_5_bn[0][0]']
conv_pw_5 (Conv2D) (None, 28, 28, 256) 65536 ['conv_dw_5_relu[0][0]']
conv_pw_5_bn (BatchNormalizati (None, 28, 28, 256) 1024 ['conv_pw_5[0][0]']
on)
conv_pw_5_relu (ReLU) (None, 28, 28, 256) 0 ['conv_pw_5_bn[0][0]']
conv_pad_6 (ZeroPadding2D) (None, 29, 29, 256) 0 ['conv_pw_5_relu[0][0]']
conv_dw_6 (DepthwiseConv2D) (None, 14, 14, 256) 2304 ['conv_pad_6[0][0]']
conv_dw_6_bn (BatchNormalizati (None, 14, 14, 256) 1024 ['conv_dw_6[0][0]']
on)
conv_dw_6_relu (ReLU) (None, 14, 14, 256) 0 ['conv_dw_6_bn[0][0]']
conv_pw_6 (Conv2D) (None, 14, 14, 512) 131072 ['conv_dw_6_relu[0][0]']
conv_pw_6_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_pw_6[0][0]']
on)
conv_pw_6_relu (ReLU) (None, 14, 14, 512) 0 ['conv_pw_6_bn[0][0]']
conv_dw_7 (DepthwiseConv2D) (None, 14, 14, 512) 4608 ['conv_pw_6_relu[0][0]']
conv_dw_7_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_dw_7[0][0]']
on)
conv_dw_7_relu (ReLU) (None, 14, 14, 512) 0 ['conv_dw_7_bn[0][0]']
conv_pw_7 (Conv2D) (None, 14, 14, 512) 262144 ['conv_dw_7_relu[0][0]']
conv_pw_7_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_pw_7[0][0]']
on)
conv_pw_7_relu (ReLU) (None, 14, 14, 512) 0 ['conv_pw_7_bn[0][0]']
conv_dw_8 (DepthwiseConv2D) (None, 14, 14, 512) 4608 ['conv_pw_7_relu[0][0]']
conv_dw_8_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_dw_8[0][0]']
on)
conv_dw_8_relu (ReLU) (None, 14, 14, 512) 0 ['conv_dw_8_bn[0][0]']
conv_pw_8 (Conv2D) (None, 14, 14, 512) 262144 ['conv_dw_8_relu[0][0]']
conv_pw_8_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_pw_8[0][0]']
on)
conv_pw_8_relu (ReLU) (None, 14, 14, 512) 0 ['conv_pw_8_bn[0][0]']
conv_dw_9 (DepthwiseConv2D) (None, 14, 14, 512) 4608 ['conv_pw_8_relu[0][0]']
conv_dw_9_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_dw_9[0][0]']
on)
conv_dw_9_relu (ReLU) (None, 14, 14, 512) 0 ['conv_dw_9_bn[0][0]']
conv_pw_9 (Conv2D) (None, 14, 14, 512) 262144 ['conv_dw_9_relu[0][0]']
conv_pw_9_bn (BatchNormalizati (None, 14, 14, 512) 2048 ['conv_pw_9[0][0]']
on)
conv_pw_9_relu (ReLU) (None, 14, 14, 512) 0 ['conv_pw_9_bn[0][0]']
conv_dw_10 (DepthwiseConv2D) (None, 14, 14, 512) 4608 ['conv_pw_9_relu[0][0]']
conv_dw_10_bn (BatchNormalizat (None, 14, 14, 512) 2048 ['conv_dw_10[0][0]']
ion)
conv_dw_10_relu (ReLU) (None, 14, 14, 512) 0 ['conv_dw_10_bn[0][0]']
conv_pw_10 (Conv2D) (None, 14, 14, 512) 262144 ['conv_dw_10_relu[0][0]']
conv_pw_10_bn (BatchNormalizat (None, 14, 14, 512) 2048 ['conv_pw_10[0][0]']
ion)
conv_pw_10_relu (ReLU) (None, 14, 14, 512) 0 ['conv_pw_10_bn[0][0]']
conv_dw_11 (DepthwiseConv2D) (None, 14, 14, 512) 4608 ['conv_pw_10_relu[0][0]']
conv_dw_11_bn (BatchNormalizat (None, 14, 14, 512) 2048 ['conv_dw_11[0][0]']
ion)
conv_dw_11_relu (ReLU) (None, 14, 14, 512) 0 ['conv_dw_11_bn[0][0]']
conv_pw_11 (Conv2D) (None, 14, 14, 512) 262144 ['conv_dw_11_relu[0][0]']
conv_pw_11_bn (BatchNormalizat (None, 14, 14, 512) 2048 ['conv_pw_11[0][0]']
ion)
conv_pw_11_relu (ReLU) (None, 14, 14, 512) 0 ['conv_pw_11_bn[0][0]']
conv_pad_12 (ZeroPadding2D) (None, 15, 15, 512) 0 ['conv_pw_11_relu[0][0]']
conv_dw_12 (DepthwiseConv2D) (None, 7, 7, 512) 4608 ['conv_pad_12[0][0]']
conv_dw_12_bn (BatchNormalizat (None, 7, 7, 512) 2048 ['conv_dw_12[0][0]']
ion)
conv_dw_12_relu (ReLU) (None, 7, 7, 512) 0 ['conv_dw_12_bn[0][0]']
conv_pw_12 (Conv2D) (None, 7, 7, 1024) 524288 ['conv_dw_12_relu[0][0]']
conv_pw_12_bn (BatchNormalizat (None, 7, 7, 1024) 4096 ['conv_pw_12[0][0]']
ion)
conv_pw_12_relu (ReLU) (None, 7, 7, 1024) 0 ['conv_pw_12_bn[0][0]']
conv_dw_13 (DepthwiseConv2D) (None, 7, 7, 1024) 9216 ['conv_pw_12_relu[0][0]']
conv_dw_13_bn (BatchNormalizat (None, 7, 7, 1024) 4096 ['conv_dw_13[0][0]']
ion)
conv_dw_13_relu (ReLU) (None, 7, 7, 1024) 0 ['conv_dw_13_bn[0][0]']
conv_pw_13 (Conv2D) (None, 7, 7, 1024) 1048576 ['conv_dw_13_relu[0][0]']
conv_pw_13_bn (BatchNormalizat (None, 7, 7, 1024) 4096 ['conv_pw_13[0][0]']
ion)
conv_pw_13_relu (ReLU) (None, 7, 7, 1024) 0 ['conv_pw_13_bn[0][0]']
global_average_pooling2d (Glob (None, 1024) 0 ['conv_pw_13_relu[0][0]']
alAveragePooling2D)
dropout (Dropout) (None, 1024) 0 ['global_average_pooling2d[0][0]'
]
dense (Dense) (None, 1024) 1049600 ['dropout[0][0]']
dropout_1 (Dropout) (None, 1024) 0 ['dense[0][0]']
dense_1 (Dense) (None, 512) 524800 ['dropout_1[0][0]']
dropout_2 (Dropout) (None, 512) 0 ['dense_1[0][0]']
dense_2 (Dense) (None, 256) 131328 ['dropout_2[0][0]']
batch_normalization (BatchNorm (None, 256) 1024 ['dense_2[0][0]']
alization)
class_op (Dense) (None, 196) 50372 ['batch_normalization[0][0]']
reg_op (Dense) (None, 4) 1028 ['batch_normalization[0][0]']
==================================================================================================
Total params: 4,987,016
Trainable params: 4,964,616
Non-trainable params: 22,400
__________________________________________________________________________________________________
optimizerVar = Adam(lr=0.0001)
final_model.compile(optimizer=optimizerVar,
loss={'reg_op':'mse', 'class_op':'categorical_crossentropy'},
metrics={'reg_op':[IoU], 'class_op':['accuracy']})
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead. super().__init__(name, **kwargs)
#Create train and test generator
batchsize = 16
train_generator = batch_generator(train_df, batch_size=batchsize, model_used='mobilenet') #batchsize can be changed
test_generator = batch_generator(test_df, batch_size=batchsize, model_used='mobilenet')
history_1 = final_model.fit(train_generator,
epochs=20,
steps_per_epoch= train_df.shape[0]//batchsize,
validation_data=test_generator,
validation_steps = test_df.shape[0]//batchsize)
Epoch 1/20 509/509 [==============================] - 130s 234ms/step - loss: 5.4988 - class_op_loss: 5.3736 - reg_op_loss: 0.1251 - class_op_accuracy: 0.0248 - reg_op_IoU: 0.1835 - val_loss: 4.9699 - val_class_op_loss: 4.9315 - val_reg_op_loss: 0.0385 - val_class_op_accuracy: 0.0647 - val_reg_op_IoU: 0.4362 Epoch 2/20 509/509 [==============================] - 119s 234ms/step - loss: 4.3058 - class_op_loss: 4.2691 - reg_op_loss: 0.0367 - class_op_accuracy: 0.0608 - reg_op_IoU: 0.4728 - val_loss: 3.5415 - val_class_op_loss: 3.5300 - val_reg_op_loss: 0.0116 - val_class_op_accuracy: 0.0600 - val_reg_op_IoU: 0.6797 Epoch 3/20 509/509 [==============================] - 119s 234ms/step - loss: 3.1601 - class_op_loss: 3.1479 - reg_op_loss: 0.0122 - class_op_accuracy: 0.0593 - reg_op_IoU: 0.6803 - val_loss: 2.9255 - val_class_op_loss: 2.9146 - val_reg_op_loss: 0.0109 - val_class_op_accuracy: 0.0621 - val_reg_op_IoU: 0.7134 Epoch 4/20 509/509 [==============================] - 127s 249ms/step - loss: 2.8902 - class_op_loss: 2.8787 - reg_op_loss: 0.0115 - class_op_accuracy: 0.0629 - reg_op_IoU: 0.7045 - val_loss: 2.8607 - val_class_op_loss: 2.8475 - val_reg_op_loss: 0.0132 - val_class_op_accuracy: 0.0596 - val_reg_op_IoU: 0.7061 Epoch 5/20 509/509 [==============================] - 121s 237ms/step - loss: 2.8366 - class_op_loss: 2.8250 - reg_op_loss: 0.0116 - class_op_accuracy: 0.0634 - reg_op_IoU: 0.7045 - val_loss: 2.9005 - val_class_op_loss: 2.8875 - val_reg_op_loss: 0.0130 - val_class_op_accuracy: 0.0644 - val_reg_op_IoU: 0.6943 Epoch 6/20 509/509 [==============================] - 119s 234ms/step - loss: 2.8196 - class_op_loss: 2.8087 - reg_op_loss: 0.0109 - class_op_accuracy: 0.0625 - reg_op_IoU: 0.7049 - val_loss: 2.8364 - val_class_op_loss: 2.8244 - val_reg_op_loss: 0.0120 - val_class_op_accuracy: 0.0589 - val_reg_op_IoU: 0.7101 Epoch 7/20 509/509 [==============================] - 114s 223ms/step - loss: 2.8124 - class_op_loss: 2.8021 - reg_op_loss: 0.0103 - class_op_accuracy: 0.0557 - reg_op_IoU: 0.7150 - val_loss: 2.8059 - val_class_op_loss: 2.7965 - val_reg_op_loss: 0.0094 - val_class_op_accuracy: 0.0637 - val_reg_op_IoU: 0.7294 Epoch 8/20 509/509 [==============================] - 115s 226ms/step - loss: 2.8049 - class_op_loss: 2.7948 - reg_op_loss: 0.0100 - class_op_accuracy: 0.0635 - reg_op_IoU: 0.7185 - val_loss: 2.8372 - val_class_op_loss: 2.8248 - val_reg_op_loss: 0.0124 - val_class_op_accuracy: 0.0634 - val_reg_op_IoU: 0.6912 Epoch 9/20 509/509 [==============================] - 115s 226ms/step - loss: 2.8059 - class_op_loss: 2.7959 - reg_op_loss: 0.0100 - class_op_accuracy: 0.0589 - reg_op_IoU: 0.7162 - val_loss: 2.8009 - val_class_op_loss: 2.7923 - val_reg_op_loss: 0.0087 - val_class_op_accuracy: 0.0666 - val_reg_op_IoU: 0.7242 Epoch 10/20 509/509 [==============================] - 116s 228ms/step - loss: 2.8000 - class_op_loss: 2.7904 - reg_op_loss: 0.0097 - class_op_accuracy: 0.0565 - reg_op_IoU: 0.7232 - val_loss: 2.8053 - val_class_op_loss: 2.7958 - val_reg_op_loss: 0.0095 - val_class_op_accuracy: 0.0637 - val_reg_op_IoU: 0.7256 Epoch 11/20 509/509 [==============================] - 116s 228ms/step - loss: 2.7962 - class_op_loss: 2.7875 - reg_op_loss: 0.0087 - class_op_accuracy: 0.0635 - reg_op_IoU: 0.7332 - val_loss: 2.8882 - val_class_op_loss: 2.8592 - val_reg_op_loss: 0.0290 - val_class_op_accuracy: 0.0580 - val_reg_op_IoU: 0.5546 Epoch 12/20 509/509 [==============================] - 113s 223ms/step - loss: 2.7940 - class_op_loss: 2.7855 - reg_op_loss: 0.0085 - class_op_accuracy: 0.0613 - reg_op_IoU: 0.7319 - val_loss: 2.7933 - val_class_op_loss: 2.7861 - val_reg_op_loss: 0.0072 - val_class_op_accuracy: 0.0585 - val_reg_op_IoU: 0.7488 Epoch 13/20 509/509 [==============================] - 120s 235ms/step - loss: 2.7926 - class_op_loss: 2.7844 - reg_op_loss: 0.0082 - class_op_accuracy: 0.0582 - reg_op_IoU: 0.7341 - val_loss: 2.7929 - val_class_op_loss: 2.7861 - val_reg_op_loss: 0.0067 - val_class_op_accuracy: 0.0656 - val_reg_op_IoU: 0.7554 Epoch 14/20 509/509 [==============================] - 122s 240ms/step - loss: 2.7932 - class_op_loss: 2.7855 - reg_op_loss: 0.0077 - class_op_accuracy: 0.0614 - reg_op_IoU: 0.7386 - val_loss: 2.7913 - val_class_op_loss: 2.7855 - val_reg_op_loss: 0.0058 - val_class_op_accuracy: 0.0629 - val_reg_op_IoU: 0.7607 Epoch 15/20 509/509 [==============================] - 120s 236ms/step - loss: 2.7915 - class_op_loss: 2.7840 - reg_op_loss: 0.0075 - class_op_accuracy: 0.0593 - reg_op_IoU: 0.7416 - val_loss: 2.7865 - val_class_op_loss: 2.7804 - val_reg_op_loss: 0.0061 - val_class_op_accuracy: 0.0669 - val_reg_op_IoU: 0.7482 Epoch 16/20 509/509 [==============================] - 123s 241ms/step - loss: 2.7918 - class_op_loss: 2.7845 - reg_op_loss: 0.0073 - class_op_accuracy: 0.0626 - reg_op_IoU: 0.7411 - val_loss: 2.7907 - val_class_op_loss: 2.7836 - val_reg_op_loss: 0.0070 - val_class_op_accuracy: 0.0672 - val_reg_op_IoU: 0.7419 Epoch 17/20 509/509 [==============================] - 122s 239ms/step - loss: 2.7904 - class_op_loss: 2.7827 - reg_op_loss: 0.0077 - class_op_accuracy: 0.0630 - reg_op_IoU: 0.7395 - val_loss: 2.7896 - val_class_op_loss: 2.7831 - val_reg_op_loss: 0.0064 - val_class_op_accuracy: 0.0596 - val_reg_op_IoU: 0.7392 Epoch 18/20 509/509 [==============================] - 120s 236ms/step - loss: 2.7903 - class_op_loss: 2.7827 - reg_op_loss: 0.0076 - class_op_accuracy: 0.0652 - reg_op_IoU: 0.7383 - val_loss: 2.7879 - val_class_op_loss: 2.7811 - val_reg_op_loss: 0.0068 - val_class_op_accuracy: 0.0618 - val_reg_op_IoU: 0.7447 Epoch 19/20 509/509 [==============================] - 120s 236ms/step - loss: 2.7915 - class_op_loss: 2.7838 - reg_op_loss: 0.0078 - class_op_accuracy: 0.0589 - reg_op_IoU: 0.7392 - val_loss: 2.7922 - val_class_op_loss: 2.7851 - val_reg_op_loss: 0.0071 - val_class_op_accuracy: 0.0635 - val_reg_op_IoU: 0.7426 Epoch 20/20 509/509 [==============================] - 121s 238ms/step - loss: 2.7922 - class_op_loss: 2.7845 - reg_op_loss: 0.0077 - class_op_accuracy: 0.0647 - reg_op_IoU: 0.7388 - val_loss: 2.7854 - val_class_op_loss: 2.7790 - val_reg_op_loss: 0.0065 - val_class_op_accuracy: 0.0596 - val_reg_op_IoU: 0.7648
final_model.save('./mob_model_rcnn.h5')
final_model.save_weights('./mob_model_rcnn.h5')
acc = history_1.history['class_op_accuracy']
val_acc = history_1.history['val_class_op_accuracy']
iou = history_1.history['reg_op_IoU']
val_iou = history_1.history['val_reg_op_IoU']
epochs_range = range(20)
plt.figure(figsize=(16, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, iou, label='Training IOU')
plt.plot(epochs_range, val_iou, label='Validation IOU')
plt.legend(loc='upper right')
plt.title('Training and Validation IOU')
plt.show()
Insights:
1) Training and Validation Accuracy:
Training Accuracy: The plot shows how accurately the model predicts the classes of objects in the training data as training progresses. Ideally, this should increase or plateau over epochs.
Validation Accuracy: This curve reflects how well the model generalizes to unseen data. It is crucial to ensure that validation accuracy increases or remains stable alongside training accuracy. A significant gap between training and validation accuracy may indicate overfitting.
2) Training and Validation IOU:
Training IOU: IOU measures the overlap between predicted bounding boxes and ground truth boxes. A higher IOU indicates better localization accuracy.
Validation IOU: Similar to validation accuracy, this metric evaluates the model's ability to generalize bounding box predictions to unseen data.
3) Overall Insights:
- Model Performance: Assess the consistency and convergence of both accuracy and IOU metrics to gauge the overall performance of the model.
- Overfitting: Watch out for signs of overfitting, such as a significant gap between training and validation metrics or a decrease in validation metrics after a certain point.
- Generalization: Ensure that the model generalizes well to unseen data by monitoring validation metrics alongside training metrics.
Model Prediction¶
def predict_and_draw(image_num, df):
#Load image
img = tf.keras.preprocessing.image.load_img(df.loc[image_num, 'path'])
w, h = img.size
#Prepare input for model
#1. Resize image
img_resized = img.resize((img_size, img_size))
#2. Conver to array and make it a batch of 1
input_array = tf.keras.preprocessing.image.img_to_array(img_resized)
input_array = np.expand_dims(input_array, axis=0)
#3. Normalize image data
input_array = tf.keras.applications.mobilenet.preprocess_input(input_array)
#Prediction
pred = final_model.predict(input_array)
pred_1 = MobileNet_model.predict(input_array)
#Get classification and regression predictions
label_pred, bbox_pred = pred_1, pred[1][0]
#Get Label with highest probability
pred_class = label_class_dict[np.argmax(label_pred)]
#Read actual label and bounding box
act_class = df.loc[image_num, 'Image class']
act_class = df.loc[image_num, 'label']
xmin, ymin, xmax, ymax = df.loc[image_num, ['x_min', 'y_min', 'x_max', 'y_max']]
print('Real Label :', act_class, '\nPredicted Label: ', pred_class)
#Draw bounding boxes - Actual (Red) and Predicted(Green)
img = cv2.imread(df.loc[image_num, 'path'])
#Draw actual bounding box - Red
img = cv2.rectangle(img, (xmin, ymin),
(xmax, ymax), (0,0,255), 3)
#Draw predicted bounding box - Green
img = cv2.rectangle(img, (int(bbox_pred[0]*w), int(bbox_pred[1]*h)),
(int((bbox_pred[0]+bbox_pred[2])*w), int((bbox_pred[1]+bbox_pred[3])*h)), (0,255,0), 3
)
#Display the picture
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
plt.imshow(img)
plt.show()
#Predict on Test Dataset
label_class_dict = dict(zip(test_df['Image class'], test_df['label']))
for i in range(0,5):
image_num = np.random.randint(0, test_df.shape[0])
predict_and_draw(image_num, test_df)
1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step Real Label : Spyker C8 Convertible 2009 Predicted Label: Scion xD Hatchback 2012
1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step Real Label : Jeep Liberty SUV 2012 Predicted Label: Jeep Liberty SUV 2012
1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step Real Label : Acura Integra Type R 2001 Predicted Label: AM General Hummer SUV 2000
1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step Real Label : Bugatti Veyron 16.4 Convertible 2009 Predicted Label: Bentley Continental Flying Spur Sedan 2007
1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 25ms/step Real Label : Mercedes-Benz SL-Class Coupe 2009 Predicted Label: Mercedes-Benz E-Class Sedan 2012
#Predict on Test Dataset
label_class_dict = dict(zip(test_df['Image class'], test_df['label']))
for i in range(0,5):
image_num = np.random.randint(0, test_df.shape[0])
predict_and_draw(image_num, test_df)
1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 27ms/step Real Label : Chevrolet Silverado 2500HD Regular Cab 2012 Predicted Label: Chevrolet Silverado 1500 Classic Extended Cab 2007
1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step Real Label : Audi S5 Coupe 2012 Predicted Label: Audi A5 Coupe 2012
1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 28ms/step Real Label : Chevrolet Traverse SUV 2012 Predicted Label: Hyundai Elantra Touring Hatchback 2012
1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step Real Label : Volvo C30 Hatchback 2012 Predicted Label: Volvo C30 Hatchback 2012
1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 21ms/step Real Label : Dodge Charger Sedan 2012 Predicted Label: Dodge Caravan Minivan 1997
test_df.columns
Index(['id', 'image', 'label', 'dataset', 'height', 'width', 'n_channels',
'path', 'source', 'x_min', 'y_min', 'x_max', 'y_max', 'Image class',
'model_year', 'label_encode'],
dtype='object')
testData_generator = batch_generator(test_df, batch_size=8, model_used='mobilenet')
print('Evaluating')
# Re-evaluate the model
test_results = final_model.evaluate(testData_generator, verbose=1, batch_size = 16, steps = 509, return_dict = True)
acc = test_results['reg_op_IoU']
print("Restored model, IOU: {:5.2f}%".format(100 * acc))
Evaluating 509/509 [==============================] - 34s 67ms/step - loss: 2.7859 - class_op_loss: 2.7793 - reg_op_loss: 0.0066 - class_op_accuracy: 0.0444 - reg_op_IoU: 0.7623 Restored model, IOU: 76.23%
print(test_results)
{'loss': 2.7852208614349365, 'class_op_loss': 2.778623104095459, 'reg_op_loss': 0.006596399936825037, 'class_op_accuracy': 0.04076620936393738, 'reg_op_IoU': 0.7596479058265686}
Efficient Net with Regression¶
# Efficient Net with Faster RCNN
from tensorflow.keras.applications.efficientnet import EfficientNetB0
model_2 = EfficientNetB0(input_shape=(224, 224, 3), include_top=False, weights='imagenet')
#Freeze all the layers
for layer in model.layers:
layer.trainable = True
#get Output layer of Pre-trained model
x1 = model_2.output
#Pooling layer
x2 = GlobalAveragePooling2D()(x1)
#Flatten the output to feed to Dense layer
x3 = Dropout(0.2)(x2)
#Add one Dense layer
x4_1 = Dense(1024, activation='relu')(x3)
x4_2 = Dropout(0.15)(x4_1)
#Add one Dense layer
x4_3 = Dense(512, activation='relu')(x4_2)
x4_4 = Dropout(0.05)(x4_3)
#Add one Dense layer
x4 = Dense(256, activation='relu')(x4_4)
#Batch Norm
x5 = BatchNormalization()(x4)
#Print the number of classes
print(num_class)
#Classification
label_output = Dense(num_class, activation='softmax', name='class_op')(x5)
196
model_2.summary()
Model: "efficientnetb0"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling (Rescaling) (None, 224, 224, 3) 0 ['input_2[0][0]']
normalization (Normalization) (None, 224, 224, 3) 7 ['rescaling[0][0]']
rescaling_1 (Rescaling) (None, 224, 224, 3) 0 ['normalization[0][0]']
stem_conv_pad (ZeroPadding2D) (None, 225, 225, 3) 0 ['rescaling_1[0][0]']
stem_conv (Conv2D) (None, 112, 112, 32 864 ['stem_conv_pad[0][0]']
)
stem_bn (BatchNormalization) (None, 112, 112, 32 128 ['stem_conv[0][0]']
)
stem_activation (Activation) (None, 112, 112, 32 0 ['stem_bn[0][0]']
)
block1a_dwconv (DepthwiseConv2 (None, 112, 112, 32 288 ['stem_activation[0][0]']
D) )
block1a_bn (BatchNormalization (None, 112, 112, 32 128 ['block1a_dwconv[0][0]']
) )
block1a_activation (Activation (None, 112, 112, 32 0 ['block1a_bn[0][0]']
) )
block1a_se_squeeze (GlobalAver (None, 32) 0 ['block1a_activation[0][0]']
agePooling2D)
block1a_se_reshape (Reshape) (None, 1, 1, 32) 0 ['block1a_se_squeeze[0][0]']
block1a_se_reduce (Conv2D) (None, 1, 1, 8) 264 ['block1a_se_reshape[0][0]']
block1a_se_expand (Conv2D) (None, 1, 1, 32) 288 ['block1a_se_reduce[0][0]']
block1a_se_excite (Multiply) (None, 112, 112, 32 0 ['block1a_activation[0][0]',
) 'block1a_se_expand[0][0]']
block1a_project_conv (Conv2D) (None, 112, 112, 16 512 ['block1a_se_excite[0][0]']
)
block1a_project_bn (BatchNorma (None, 112, 112, 16 64 ['block1a_project_conv[0][0]']
lization) )
block2a_expand_conv (Conv2D) (None, 112, 112, 96 1536 ['block1a_project_bn[0][0]']
)
block2a_expand_bn (BatchNormal (None, 112, 112, 96 384 ['block2a_expand_conv[0][0]']
ization) )
block2a_expand_activation (Act (None, 112, 112, 96 0 ['block2a_expand_bn[0][0]']
ivation) )
block2a_dwconv_pad (ZeroPaddin (None, 113, 113, 96 0 ['block2a_expand_activation[0][0]
g2D) ) ']
block2a_dwconv (DepthwiseConv2 (None, 56, 56, 96) 864 ['block2a_dwconv_pad[0][0]']
D)
block2a_bn (BatchNormalization (None, 56, 56, 96) 384 ['block2a_dwconv[0][0]']
)
block2a_activation (Activation (None, 56, 56, 96) 0 ['block2a_bn[0][0]']
)
block2a_se_squeeze (GlobalAver (None, 96) 0 ['block2a_activation[0][0]']
agePooling2D)
block2a_se_reshape (Reshape) (None, 1, 1, 96) 0 ['block2a_se_squeeze[0][0]']
block2a_se_reduce (Conv2D) (None, 1, 1, 4) 388 ['block2a_se_reshape[0][0]']
block2a_se_expand (Conv2D) (None, 1, 1, 96) 480 ['block2a_se_reduce[0][0]']
block2a_se_excite (Multiply) (None, 56, 56, 96) 0 ['block2a_activation[0][0]',
'block2a_se_expand[0][0]']
block2a_project_conv (Conv2D) (None, 56, 56, 24) 2304 ['block2a_se_excite[0][0]']
block2a_project_bn (BatchNorma (None, 56, 56, 24) 96 ['block2a_project_conv[0][0]']
lization)
block2b_expand_conv (Conv2D) (None, 56, 56, 144) 3456 ['block2a_project_bn[0][0]']
block2b_expand_bn (BatchNormal (None, 56, 56, 144) 576 ['block2b_expand_conv[0][0]']
ization)
block2b_expand_activation (Act (None, 56, 56, 144) 0 ['block2b_expand_bn[0][0]']
ivation)
block2b_dwconv (DepthwiseConv2 (None, 56, 56, 144) 1296 ['block2b_expand_activation[0][0]
D) ']
block2b_bn (BatchNormalization (None, 56, 56, 144) 576 ['block2b_dwconv[0][0]']
)
block2b_activation (Activation (None, 56, 56, 144) 0 ['block2b_bn[0][0]']
)
block2b_se_squeeze (GlobalAver (None, 144) 0 ['block2b_activation[0][0]']
agePooling2D)
block2b_se_reshape (Reshape) (None, 1, 1, 144) 0 ['block2b_se_squeeze[0][0]']
block2b_se_reduce (Conv2D) (None, 1, 1, 6) 870 ['block2b_se_reshape[0][0]']
block2b_se_expand (Conv2D) (None, 1, 1, 144) 1008 ['block2b_se_reduce[0][0]']
block2b_se_excite (Multiply) (None, 56, 56, 144) 0 ['block2b_activation[0][0]',
'block2b_se_expand[0][0]']
block2b_project_conv (Conv2D) (None, 56, 56, 24) 3456 ['block2b_se_excite[0][0]']
block2b_project_bn (BatchNorma (None, 56, 56, 24) 96 ['block2b_project_conv[0][0]']
lization)
block2b_drop (Dropout) (None, 56, 56, 24) 0 ['block2b_project_bn[0][0]']
block2b_add (Add) (None, 56, 56, 24) 0 ['block2b_drop[0][0]',
'block2a_project_bn[0][0]']
block3a_expand_conv (Conv2D) (None, 56, 56, 144) 3456 ['block2b_add[0][0]']
block3a_expand_bn (BatchNormal (None, 56, 56, 144) 576 ['block3a_expand_conv[0][0]']
ization)
block3a_expand_activation (Act (None, 56, 56, 144) 0 ['block3a_expand_bn[0][0]']
ivation)
block3a_dwconv_pad (ZeroPaddin (None, 59, 59, 144) 0 ['block3a_expand_activation[0][0]
g2D) ']
block3a_dwconv (DepthwiseConv2 (None, 28, 28, 144) 3600 ['block3a_dwconv_pad[0][0]']
D)
block3a_bn (BatchNormalization (None, 28, 28, 144) 576 ['block3a_dwconv[0][0]']
)
block3a_activation (Activation (None, 28, 28, 144) 0 ['block3a_bn[0][0]']
)
block3a_se_squeeze (GlobalAver (None, 144) 0 ['block3a_activation[0][0]']
agePooling2D)
block3a_se_reshape (Reshape) (None, 1, 1, 144) 0 ['block3a_se_squeeze[0][0]']
block3a_se_reduce (Conv2D) (None, 1, 1, 6) 870 ['block3a_se_reshape[0][0]']
block3a_se_expand (Conv2D) (None, 1, 1, 144) 1008 ['block3a_se_reduce[0][0]']
block3a_se_excite (Multiply) (None, 28, 28, 144) 0 ['block3a_activation[0][0]',
'block3a_se_expand[0][0]']
block3a_project_conv (Conv2D) (None, 28, 28, 40) 5760 ['block3a_se_excite[0][0]']
block3a_project_bn (BatchNorma (None, 28, 28, 40) 160 ['block3a_project_conv[0][0]']
lization)
block3b_expand_conv (Conv2D) (None, 28, 28, 240) 9600 ['block3a_project_bn[0][0]']
block3b_expand_bn (BatchNormal (None, 28, 28, 240) 960 ['block3b_expand_conv[0][0]']
ization)
block3b_expand_activation (Act (None, 28, 28, 240) 0 ['block3b_expand_bn[0][0]']
ivation)
block3b_dwconv (DepthwiseConv2 (None, 28, 28, 240) 6000 ['block3b_expand_activation[0][0]
D) ']
block3b_bn (BatchNormalization (None, 28, 28, 240) 960 ['block3b_dwconv[0][0]']
)
block3b_activation (Activation (None, 28, 28, 240) 0 ['block3b_bn[0][0]']
)
block3b_se_squeeze (GlobalAver (None, 240) 0 ['block3b_activation[0][0]']
agePooling2D)
block3b_se_reshape (Reshape) (None, 1, 1, 240) 0 ['block3b_se_squeeze[0][0]']
block3b_se_reduce (Conv2D) (None, 1, 1, 10) 2410 ['block3b_se_reshape[0][0]']
block3b_se_expand (Conv2D) (None, 1, 1, 240) 2640 ['block3b_se_reduce[0][0]']
block3b_se_excite (Multiply) (None, 28, 28, 240) 0 ['block3b_activation[0][0]',
'block3b_se_expand[0][0]']
block3b_project_conv (Conv2D) (None, 28, 28, 40) 9600 ['block3b_se_excite[0][0]']
block3b_project_bn (BatchNorma (None, 28, 28, 40) 160 ['block3b_project_conv[0][0]']
lization)
block3b_drop (Dropout) (None, 28, 28, 40) 0 ['block3b_project_bn[0][0]']
block3b_add (Add) (None, 28, 28, 40) 0 ['block3b_drop[0][0]',
'block3a_project_bn[0][0]']
block4a_expand_conv (Conv2D) (None, 28, 28, 240) 9600 ['block3b_add[0][0]']
block4a_expand_bn (BatchNormal (None, 28, 28, 240) 960 ['block4a_expand_conv[0][0]']
ization)
block4a_expand_activation (Act (None, 28, 28, 240) 0 ['block4a_expand_bn[0][0]']
ivation)
block4a_dwconv_pad (ZeroPaddin (None, 29, 29, 240) 0 ['block4a_expand_activation[0][0]
g2D) ']
block4a_dwconv (DepthwiseConv2 (None, 14, 14, 240) 2160 ['block4a_dwconv_pad[0][0]']
D)
block4a_bn (BatchNormalization (None, 14, 14, 240) 960 ['block4a_dwconv[0][0]']
)
block4a_activation (Activation (None, 14, 14, 240) 0 ['block4a_bn[0][0]']
)
block4a_se_squeeze (GlobalAver (None, 240) 0 ['block4a_activation[0][0]']
agePooling2D)
block4a_se_reshape (Reshape) (None, 1, 1, 240) 0 ['block4a_se_squeeze[0][0]']
block4a_se_reduce (Conv2D) (None, 1, 1, 10) 2410 ['block4a_se_reshape[0][0]']
block4a_se_expand (Conv2D) (None, 1, 1, 240) 2640 ['block4a_se_reduce[0][0]']
block4a_se_excite (Multiply) (None, 14, 14, 240) 0 ['block4a_activation[0][0]',
'block4a_se_expand[0][0]']
block4a_project_conv (Conv2D) (None, 14, 14, 80) 19200 ['block4a_se_excite[0][0]']
block4a_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4a_project_conv[0][0]']
lization)
block4b_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4a_project_bn[0][0]']
block4b_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block4b_expand_conv[0][0]']
ization)
block4b_expand_activation (Act (None, 14, 14, 480) 0 ['block4b_expand_bn[0][0]']
ivation)
block4b_dwconv (DepthwiseConv2 (None, 14, 14, 480) 4320 ['block4b_expand_activation[0][0]
D) ']
block4b_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block4b_dwconv[0][0]']
)
block4b_activation (Activation (None, 14, 14, 480) 0 ['block4b_bn[0][0]']
)
block4b_se_squeeze (GlobalAver (None, 480) 0 ['block4b_activation[0][0]']
agePooling2D)
block4b_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block4b_se_squeeze[0][0]']
block4b_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block4b_se_reshape[0][0]']
block4b_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block4b_se_reduce[0][0]']
block4b_se_excite (Multiply) (None, 14, 14, 480) 0 ['block4b_activation[0][0]',
'block4b_se_expand[0][0]']
block4b_project_conv (Conv2D) (None, 14, 14, 80) 38400 ['block4b_se_excite[0][0]']
block4b_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4b_project_conv[0][0]']
lization)
block4b_drop (Dropout) (None, 14, 14, 80) 0 ['block4b_project_bn[0][0]']
block4b_add (Add) (None, 14, 14, 80) 0 ['block4b_drop[0][0]',
'block4a_project_bn[0][0]']
block4c_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4b_add[0][0]']
block4c_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block4c_expand_conv[0][0]']
ization)
block4c_expand_activation (Act (None, 14, 14, 480) 0 ['block4c_expand_bn[0][0]']
ivation)
block4c_dwconv (DepthwiseConv2 (None, 14, 14, 480) 4320 ['block4c_expand_activation[0][0]
D) ']
block4c_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block4c_dwconv[0][0]']
)
block4c_activation (Activation (None, 14, 14, 480) 0 ['block4c_bn[0][0]']
)
block4c_se_squeeze (GlobalAver (None, 480) 0 ['block4c_activation[0][0]']
agePooling2D)
block4c_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block4c_se_squeeze[0][0]']
block4c_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block4c_se_reshape[0][0]']
block4c_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block4c_se_reduce[0][0]']
block4c_se_excite (Multiply) (None, 14, 14, 480) 0 ['block4c_activation[0][0]',
'block4c_se_expand[0][0]']
block4c_project_conv (Conv2D) (None, 14, 14, 80) 38400 ['block4c_se_excite[0][0]']
block4c_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4c_project_conv[0][0]']
lization)
block4c_drop (Dropout) (None, 14, 14, 80) 0 ['block4c_project_bn[0][0]']
block4c_add (Add) (None, 14, 14, 80) 0 ['block4c_drop[0][0]',
'block4b_add[0][0]']
block5a_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4c_add[0][0]']
block5a_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block5a_expand_conv[0][0]']
ization)
block5a_expand_activation (Act (None, 14, 14, 480) 0 ['block5a_expand_bn[0][0]']
ivation)
block5a_dwconv (DepthwiseConv2 (None, 14, 14, 480) 12000 ['block5a_expand_activation[0][0]
D) ']
block5a_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block5a_dwconv[0][0]']
)
block5a_activation (Activation (None, 14, 14, 480) 0 ['block5a_bn[0][0]']
)
block5a_se_squeeze (GlobalAver (None, 480) 0 ['block5a_activation[0][0]']
agePooling2D)
block5a_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block5a_se_squeeze[0][0]']
block5a_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block5a_se_reshape[0][0]']
block5a_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block5a_se_reduce[0][0]']
block5a_se_excite (Multiply) (None, 14, 14, 480) 0 ['block5a_activation[0][0]',
'block5a_se_expand[0][0]']
block5a_project_conv (Conv2D) (None, 14, 14, 112) 53760 ['block5a_se_excite[0][0]']
block5a_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5a_project_conv[0][0]']
lization)
block5b_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5a_project_bn[0][0]']
block5b_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5b_expand_conv[0][0]']
ization)
block5b_expand_activation (Act (None, 14, 14, 672) 0 ['block5b_expand_bn[0][0]']
ivation)
block5b_dwconv (DepthwiseConv2 (None, 14, 14, 672) 16800 ['block5b_expand_activation[0][0]
D) ']
block5b_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5b_dwconv[0][0]']
)
block5b_activation (Activation (None, 14, 14, 672) 0 ['block5b_bn[0][0]']
)
block5b_se_squeeze (GlobalAver (None, 672) 0 ['block5b_activation[0][0]']
agePooling2D)
block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5b_se_squeeze[0][0]']
block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5b_se_reshape[0][0]']
block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5b_se_reduce[0][0]']
block5b_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5b_activation[0][0]',
'block5b_se_expand[0][0]']
block5b_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5b_se_excite[0][0]']
block5b_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5b_project_conv[0][0]']
lization)
block5b_drop (Dropout) (None, 14, 14, 112) 0 ['block5b_project_bn[0][0]']
block5b_add (Add) (None, 14, 14, 112) 0 ['block5b_drop[0][0]',
'block5a_project_bn[0][0]']
block5c_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5b_add[0][0]']
block5c_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5c_expand_conv[0][0]']
ization)
block5c_expand_activation (Act (None, 14, 14, 672) 0 ['block5c_expand_bn[0][0]']
ivation)
block5c_dwconv (DepthwiseConv2 (None, 14, 14, 672) 16800 ['block5c_expand_activation[0][0]
D) ']
block5c_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5c_dwconv[0][0]']
)
block5c_activation (Activation (None, 14, 14, 672) 0 ['block5c_bn[0][0]']
)
block5c_se_squeeze (GlobalAver (None, 672) 0 ['block5c_activation[0][0]']
agePooling2D)
block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5c_se_squeeze[0][0]']
block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5c_se_reshape[0][0]']
block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5c_se_reduce[0][0]']
block5c_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5c_activation[0][0]',
'block5c_se_expand[0][0]']
block5c_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5c_se_excite[0][0]']
block5c_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5c_project_conv[0][0]']
lization)
block5c_drop (Dropout) (None, 14, 14, 112) 0 ['block5c_project_bn[0][0]']
block5c_add (Add) (None, 14, 14, 112) 0 ['block5c_drop[0][0]',
'block5b_add[0][0]']
block6a_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5c_add[0][0]']
block6a_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block6a_expand_conv[0][0]']
ization)
block6a_expand_activation (Act (None, 14, 14, 672) 0 ['block6a_expand_bn[0][0]']
ivation)
block6a_dwconv_pad (ZeroPaddin (None, 17, 17, 672) 0 ['block6a_expand_activation[0][0]
g2D) ']
block6a_dwconv (DepthwiseConv2 (None, 7, 7, 672) 16800 ['block6a_dwconv_pad[0][0]']
D)
block6a_bn (BatchNormalization (None, 7, 7, 672) 2688 ['block6a_dwconv[0][0]']
)
block6a_activation (Activation (None, 7, 7, 672) 0 ['block6a_bn[0][0]']
)
block6a_se_squeeze (GlobalAver (None, 672) 0 ['block6a_activation[0][0]']
agePooling2D)
block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block6a_se_squeeze[0][0]']
block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block6a_se_reshape[0][0]']
block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block6a_se_reduce[0][0]']
block6a_se_excite (Multiply) (None, 7, 7, 672) 0 ['block6a_activation[0][0]',
'block6a_se_expand[0][0]']
block6a_project_conv (Conv2D) (None, 7, 7, 192) 129024 ['block6a_se_excite[0][0]']
block6a_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6a_project_conv[0][0]']
lization)
block6b_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6a_project_bn[0][0]']
block6b_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6b_expand_conv[0][0]']
ization)
block6b_expand_activation (Act (None, 7, 7, 1152) 0 ['block6b_expand_bn[0][0]']
ivation)
block6b_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6b_expand_activation[0][0]
D) ']
block6b_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6b_dwconv[0][0]']
)
block6b_activation (Activation (None, 7, 7, 1152) 0 ['block6b_bn[0][0]']
)
block6b_se_squeeze (GlobalAver (None, 1152) 0 ['block6b_activation[0][0]']
agePooling2D)
block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6b_se_squeeze[0][0]']
block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6b_se_reshape[0][0]']
block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6b_se_reduce[0][0]']
block6b_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6b_activation[0][0]',
'block6b_se_expand[0][0]']
block6b_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6b_se_excite[0][0]']
block6b_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6b_project_conv[0][0]']
lization)
block6b_drop (Dropout) (None, 7, 7, 192) 0 ['block6b_project_bn[0][0]']
block6b_add (Add) (None, 7, 7, 192) 0 ['block6b_drop[0][0]',
'block6a_project_bn[0][0]']
block6c_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6b_add[0][0]']
block6c_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6c_expand_conv[0][0]']
ization)
block6c_expand_activation (Act (None, 7, 7, 1152) 0 ['block6c_expand_bn[0][0]']
ivation)
block6c_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6c_expand_activation[0][0]
D) ']
block6c_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6c_dwconv[0][0]']
)
block6c_activation (Activation (None, 7, 7, 1152) 0 ['block6c_bn[0][0]']
)
block6c_se_squeeze (GlobalAver (None, 1152) 0 ['block6c_activation[0][0]']
agePooling2D)
block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6c_se_squeeze[0][0]']
block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6c_se_reshape[0][0]']
block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6c_se_reduce[0][0]']
block6c_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6c_activation[0][0]',
'block6c_se_expand[0][0]']
block6c_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6c_se_excite[0][0]']
block6c_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6c_project_conv[0][0]']
lization)
block6c_drop (Dropout) (None, 7, 7, 192) 0 ['block6c_project_bn[0][0]']
block6c_add (Add) (None, 7, 7, 192) 0 ['block6c_drop[0][0]',
'block6b_add[0][0]']
block6d_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6c_add[0][0]']
block6d_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6d_expand_conv[0][0]']
ization)
block6d_expand_activation (Act (None, 7, 7, 1152) 0 ['block6d_expand_bn[0][0]']
ivation)
block6d_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6d_expand_activation[0][0]
D) ']
block6d_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6d_dwconv[0][0]']
)
block6d_activation (Activation (None, 7, 7, 1152) 0 ['block6d_bn[0][0]']
)
block6d_se_squeeze (GlobalAver (None, 1152) 0 ['block6d_activation[0][0]']
agePooling2D)
block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6d_se_squeeze[0][0]']
block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6d_se_reshape[0][0]']
block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6d_se_reduce[0][0]']
block6d_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6d_activation[0][0]',
'block6d_se_expand[0][0]']
block6d_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6d_se_excite[0][0]']
block6d_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6d_project_conv[0][0]']
lization)
block6d_drop (Dropout) (None, 7, 7, 192) 0 ['block6d_project_bn[0][0]']
block6d_add (Add) (None, 7, 7, 192) 0 ['block6d_drop[0][0]',
'block6c_add[0][0]']
block7a_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6d_add[0][0]']
block7a_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block7a_expand_conv[0][0]']
ization)
block7a_expand_activation (Act (None, 7, 7, 1152) 0 ['block7a_expand_bn[0][0]']
ivation)
block7a_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 10368 ['block7a_expand_activation[0][0]
D) ']
block7a_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block7a_dwconv[0][0]']
)
block7a_activation (Activation (None, 7, 7, 1152) 0 ['block7a_bn[0][0]']
)
block7a_se_squeeze (GlobalAver (None, 1152) 0 ['block7a_activation[0][0]']
agePooling2D)
block7a_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block7a_se_squeeze[0][0]']
block7a_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block7a_se_reshape[0][0]']
block7a_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block7a_se_reduce[0][0]']
block7a_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block7a_activation[0][0]',
'block7a_se_expand[0][0]']
block7a_project_conv (Conv2D) (None, 7, 7, 320) 368640 ['block7a_se_excite[0][0]']
block7a_project_bn (BatchNorma (None, 7, 7, 320) 1280 ['block7a_project_conv[0][0]']
lization)
top_conv (Conv2D) (None, 7, 7, 1280) 409600 ['block7a_project_bn[0][0]']
top_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['top_conv[0][0]']
top_activation (Activation) (None, 7, 7, 1280) 0 ['top_bn[0][0]']
==================================================================================================
Total params: 4,049,571
Trainable params: 4,007,548
Non-trainable params: 42,023
__________________________________________________________________________________________________
len(model_2.layers)
238
model_2.output
<KerasTensor: shape=(None, 7, 7, 1280) dtype=float32 (created by layer 'top_activation')>
label_output
<KerasTensor: shape=(None, 196) dtype=float32 (created by layer 'class_op')>
#Regression
bbox_output = Dense(4, activation='sigmoid', name='reg_op')(x3)
bbox_output
<KerasTensor: shape=(None, 4) dtype=float32 (created by layer 'reg_op')>
#Non Sequential model as it has two different outputs
final_model_2 = Model(inputs=model_2.input, #Pre-trained model input as input layer
outputs=[label_output,bbox_output]) #Output layer added
final_model_2.summary()
Model: "model_3"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling (Rescaling) (None, 224, 224, 3) 0 ['input_2[0][0]']
normalization (Normalization) (None, 224, 224, 3) 7 ['rescaling[0][0]']
rescaling_1 (Rescaling) (None, 224, 224, 3) 0 ['normalization[0][0]']
stem_conv_pad (ZeroPadding2D) (None, 225, 225, 3) 0 ['rescaling_1[0][0]']
stem_conv (Conv2D) (None, 112, 112, 32 864 ['stem_conv_pad[0][0]']
)
stem_bn (BatchNormalization) (None, 112, 112, 32 128 ['stem_conv[0][0]']
)
stem_activation (Activation) (None, 112, 112, 32 0 ['stem_bn[0][0]']
)
block1a_dwconv (DepthwiseConv2 (None, 112, 112, 32 288 ['stem_activation[0][0]']
D) )
block1a_bn (BatchNormalization (None, 112, 112, 32 128 ['block1a_dwconv[0][0]']
) )
block1a_activation (Activation (None, 112, 112, 32 0 ['block1a_bn[0][0]']
) )
block1a_se_squeeze (GlobalAver (None, 32) 0 ['block1a_activation[0][0]']
agePooling2D)
block1a_se_reshape (Reshape) (None, 1, 1, 32) 0 ['block1a_se_squeeze[0][0]']
block1a_se_reduce (Conv2D) (None, 1, 1, 8) 264 ['block1a_se_reshape[0][0]']
block1a_se_expand (Conv2D) (None, 1, 1, 32) 288 ['block1a_se_reduce[0][0]']
block1a_se_excite (Multiply) (None, 112, 112, 32 0 ['block1a_activation[0][0]',
) 'block1a_se_expand[0][0]']
block1a_project_conv (Conv2D) (None, 112, 112, 16 512 ['block1a_se_excite[0][0]']
)
block1a_project_bn (BatchNorma (None, 112, 112, 16 64 ['block1a_project_conv[0][0]']
lization) )
block2a_expand_conv (Conv2D) (None, 112, 112, 96 1536 ['block1a_project_bn[0][0]']
)
block2a_expand_bn (BatchNormal (None, 112, 112, 96 384 ['block2a_expand_conv[0][0]']
ization) )
block2a_expand_activation (Act (None, 112, 112, 96 0 ['block2a_expand_bn[0][0]']
ivation) )
block2a_dwconv_pad (ZeroPaddin (None, 113, 113, 96 0 ['block2a_expand_activation[0][0]
g2D) ) ']
block2a_dwconv (DepthwiseConv2 (None, 56, 56, 96) 864 ['block2a_dwconv_pad[0][0]']
D)
block2a_bn (BatchNormalization (None, 56, 56, 96) 384 ['block2a_dwconv[0][0]']
)
block2a_activation (Activation (None, 56, 56, 96) 0 ['block2a_bn[0][0]']
)
block2a_se_squeeze (GlobalAver (None, 96) 0 ['block2a_activation[0][0]']
agePooling2D)
block2a_se_reshape (Reshape) (None, 1, 1, 96) 0 ['block2a_se_squeeze[0][0]']
block2a_se_reduce (Conv2D) (None, 1, 1, 4) 388 ['block2a_se_reshape[0][0]']
block2a_se_expand (Conv2D) (None, 1, 1, 96) 480 ['block2a_se_reduce[0][0]']
block2a_se_excite (Multiply) (None, 56, 56, 96) 0 ['block2a_activation[0][0]',
'block2a_se_expand[0][0]']
block2a_project_conv (Conv2D) (None, 56, 56, 24) 2304 ['block2a_se_excite[0][0]']
block2a_project_bn (BatchNorma (None, 56, 56, 24) 96 ['block2a_project_conv[0][0]']
lization)
block2b_expand_conv (Conv2D) (None, 56, 56, 144) 3456 ['block2a_project_bn[0][0]']
block2b_expand_bn (BatchNormal (None, 56, 56, 144) 576 ['block2b_expand_conv[0][0]']
ization)
block2b_expand_activation (Act (None, 56, 56, 144) 0 ['block2b_expand_bn[0][0]']
ivation)
block2b_dwconv (DepthwiseConv2 (None, 56, 56, 144) 1296 ['block2b_expand_activation[0][0]
D) ']
block2b_bn (BatchNormalization (None, 56, 56, 144) 576 ['block2b_dwconv[0][0]']
)
block2b_activation (Activation (None, 56, 56, 144) 0 ['block2b_bn[0][0]']
)
block2b_se_squeeze (GlobalAver (None, 144) 0 ['block2b_activation[0][0]']
agePooling2D)
block2b_se_reshape (Reshape) (None, 1, 1, 144) 0 ['block2b_se_squeeze[0][0]']
block2b_se_reduce (Conv2D) (None, 1, 1, 6) 870 ['block2b_se_reshape[0][0]']
block2b_se_expand (Conv2D) (None, 1, 1, 144) 1008 ['block2b_se_reduce[0][0]']
block2b_se_excite (Multiply) (None, 56, 56, 144) 0 ['block2b_activation[0][0]',
'block2b_se_expand[0][0]']
block2b_project_conv (Conv2D) (None, 56, 56, 24) 3456 ['block2b_se_excite[0][0]']
block2b_project_bn (BatchNorma (None, 56, 56, 24) 96 ['block2b_project_conv[0][0]']
lization)
block2b_drop (Dropout) (None, 56, 56, 24) 0 ['block2b_project_bn[0][0]']
block2b_add (Add) (None, 56, 56, 24) 0 ['block2b_drop[0][0]',
'block2a_project_bn[0][0]']
block3a_expand_conv (Conv2D) (None, 56, 56, 144) 3456 ['block2b_add[0][0]']
block3a_expand_bn (BatchNormal (None, 56, 56, 144) 576 ['block3a_expand_conv[0][0]']
ization)
block3a_expand_activation (Act (None, 56, 56, 144) 0 ['block3a_expand_bn[0][0]']
ivation)
block3a_dwconv_pad (ZeroPaddin (None, 59, 59, 144) 0 ['block3a_expand_activation[0][0]
g2D) ']
block3a_dwconv (DepthwiseConv2 (None, 28, 28, 144) 3600 ['block3a_dwconv_pad[0][0]']
D)
block3a_bn (BatchNormalization (None, 28, 28, 144) 576 ['block3a_dwconv[0][0]']
)
block3a_activation (Activation (None, 28, 28, 144) 0 ['block3a_bn[0][0]']
)
block3a_se_squeeze (GlobalAver (None, 144) 0 ['block3a_activation[0][0]']
agePooling2D)
block3a_se_reshape (Reshape) (None, 1, 1, 144) 0 ['block3a_se_squeeze[0][0]']
block3a_se_reduce (Conv2D) (None, 1, 1, 6) 870 ['block3a_se_reshape[0][0]']
block3a_se_expand (Conv2D) (None, 1, 1, 144) 1008 ['block3a_se_reduce[0][0]']
block3a_se_excite (Multiply) (None, 28, 28, 144) 0 ['block3a_activation[0][0]',
'block3a_se_expand[0][0]']
block3a_project_conv (Conv2D) (None, 28, 28, 40) 5760 ['block3a_se_excite[0][0]']
block3a_project_bn (BatchNorma (None, 28, 28, 40) 160 ['block3a_project_conv[0][0]']
lization)
block3b_expand_conv (Conv2D) (None, 28, 28, 240) 9600 ['block3a_project_bn[0][0]']
block3b_expand_bn (BatchNormal (None, 28, 28, 240) 960 ['block3b_expand_conv[0][0]']
ization)
block3b_expand_activation (Act (None, 28, 28, 240) 0 ['block3b_expand_bn[0][0]']
ivation)
block3b_dwconv (DepthwiseConv2 (None, 28, 28, 240) 6000 ['block3b_expand_activation[0][0]
D) ']
block3b_bn (BatchNormalization (None, 28, 28, 240) 960 ['block3b_dwconv[0][0]']
)
block3b_activation (Activation (None, 28, 28, 240) 0 ['block3b_bn[0][0]']
)
block3b_se_squeeze (GlobalAver (None, 240) 0 ['block3b_activation[0][0]']
agePooling2D)
block3b_se_reshape (Reshape) (None, 1, 1, 240) 0 ['block3b_se_squeeze[0][0]']
block3b_se_reduce (Conv2D) (None, 1, 1, 10) 2410 ['block3b_se_reshape[0][0]']
block3b_se_expand (Conv2D) (None, 1, 1, 240) 2640 ['block3b_se_reduce[0][0]']
block3b_se_excite (Multiply) (None, 28, 28, 240) 0 ['block3b_activation[0][0]',
'block3b_se_expand[0][0]']
block3b_project_conv (Conv2D) (None, 28, 28, 40) 9600 ['block3b_se_excite[0][0]']
block3b_project_bn (BatchNorma (None, 28, 28, 40) 160 ['block3b_project_conv[0][0]']
lization)
block3b_drop (Dropout) (None, 28, 28, 40) 0 ['block3b_project_bn[0][0]']
block3b_add (Add) (None, 28, 28, 40) 0 ['block3b_drop[0][0]',
'block3a_project_bn[0][0]']
block4a_expand_conv (Conv2D) (None, 28, 28, 240) 9600 ['block3b_add[0][0]']
block4a_expand_bn (BatchNormal (None, 28, 28, 240) 960 ['block4a_expand_conv[0][0]']
ization)
block4a_expand_activation (Act (None, 28, 28, 240) 0 ['block4a_expand_bn[0][0]']
ivation)
block4a_dwconv_pad (ZeroPaddin (None, 29, 29, 240) 0 ['block4a_expand_activation[0][0]
g2D) ']
block4a_dwconv (DepthwiseConv2 (None, 14, 14, 240) 2160 ['block4a_dwconv_pad[0][0]']
D)
block4a_bn (BatchNormalization (None, 14, 14, 240) 960 ['block4a_dwconv[0][0]']
)
block4a_activation (Activation (None, 14, 14, 240) 0 ['block4a_bn[0][0]']
)
block4a_se_squeeze (GlobalAver (None, 240) 0 ['block4a_activation[0][0]']
agePooling2D)
block4a_se_reshape (Reshape) (None, 1, 1, 240) 0 ['block4a_se_squeeze[0][0]']
block4a_se_reduce (Conv2D) (None, 1, 1, 10) 2410 ['block4a_se_reshape[0][0]']
block4a_se_expand (Conv2D) (None, 1, 1, 240) 2640 ['block4a_se_reduce[0][0]']
block4a_se_excite (Multiply) (None, 14, 14, 240) 0 ['block4a_activation[0][0]',
'block4a_se_expand[0][0]']
block4a_project_conv (Conv2D) (None, 14, 14, 80) 19200 ['block4a_se_excite[0][0]']
block4a_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4a_project_conv[0][0]']
lization)
block4b_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4a_project_bn[0][0]']
block4b_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block4b_expand_conv[0][0]']
ization)
block4b_expand_activation (Act (None, 14, 14, 480) 0 ['block4b_expand_bn[0][0]']
ivation)
block4b_dwconv (DepthwiseConv2 (None, 14, 14, 480) 4320 ['block4b_expand_activation[0][0]
D) ']
block4b_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block4b_dwconv[0][0]']
)
block4b_activation (Activation (None, 14, 14, 480) 0 ['block4b_bn[0][0]']
)
block4b_se_squeeze (GlobalAver (None, 480) 0 ['block4b_activation[0][0]']
agePooling2D)
block4b_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block4b_se_squeeze[0][0]']
block4b_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block4b_se_reshape[0][0]']
block4b_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block4b_se_reduce[0][0]']
block4b_se_excite (Multiply) (None, 14, 14, 480) 0 ['block4b_activation[0][0]',
'block4b_se_expand[0][0]']
block4b_project_conv (Conv2D) (None, 14, 14, 80) 38400 ['block4b_se_excite[0][0]']
block4b_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4b_project_conv[0][0]']
lization)
block4b_drop (Dropout) (None, 14, 14, 80) 0 ['block4b_project_bn[0][0]']
block4b_add (Add) (None, 14, 14, 80) 0 ['block4b_drop[0][0]',
'block4a_project_bn[0][0]']
block4c_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4b_add[0][0]']
block4c_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block4c_expand_conv[0][0]']
ization)
block4c_expand_activation (Act (None, 14, 14, 480) 0 ['block4c_expand_bn[0][0]']
ivation)
block4c_dwconv (DepthwiseConv2 (None, 14, 14, 480) 4320 ['block4c_expand_activation[0][0]
D) ']
block4c_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block4c_dwconv[0][0]']
)
block4c_activation (Activation (None, 14, 14, 480) 0 ['block4c_bn[0][0]']
)
block4c_se_squeeze (GlobalAver (None, 480) 0 ['block4c_activation[0][0]']
agePooling2D)
block4c_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block4c_se_squeeze[0][0]']
block4c_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block4c_se_reshape[0][0]']
block4c_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block4c_se_reduce[0][0]']
block4c_se_excite (Multiply) (None, 14, 14, 480) 0 ['block4c_activation[0][0]',
'block4c_se_expand[0][0]']
block4c_project_conv (Conv2D) (None, 14, 14, 80) 38400 ['block4c_se_excite[0][0]']
block4c_project_bn (BatchNorma (None, 14, 14, 80) 320 ['block4c_project_conv[0][0]']
lization)
block4c_drop (Dropout) (None, 14, 14, 80) 0 ['block4c_project_bn[0][0]']
block4c_add (Add) (None, 14, 14, 80) 0 ['block4c_drop[0][0]',
'block4b_add[0][0]']
block5a_expand_conv (Conv2D) (None, 14, 14, 480) 38400 ['block4c_add[0][0]']
block5a_expand_bn (BatchNormal (None, 14, 14, 480) 1920 ['block5a_expand_conv[0][0]']
ization)
block5a_expand_activation (Act (None, 14, 14, 480) 0 ['block5a_expand_bn[0][0]']
ivation)
block5a_dwconv (DepthwiseConv2 (None, 14, 14, 480) 12000 ['block5a_expand_activation[0][0]
D) ']
block5a_bn (BatchNormalization (None, 14, 14, 480) 1920 ['block5a_dwconv[0][0]']
)
block5a_activation (Activation (None, 14, 14, 480) 0 ['block5a_bn[0][0]']
)
block5a_se_squeeze (GlobalAver (None, 480) 0 ['block5a_activation[0][0]']
agePooling2D)
block5a_se_reshape (Reshape) (None, 1, 1, 480) 0 ['block5a_se_squeeze[0][0]']
block5a_se_reduce (Conv2D) (None, 1, 1, 20) 9620 ['block5a_se_reshape[0][0]']
block5a_se_expand (Conv2D) (None, 1, 1, 480) 10080 ['block5a_se_reduce[0][0]']
block5a_se_excite (Multiply) (None, 14, 14, 480) 0 ['block5a_activation[0][0]',
'block5a_se_expand[0][0]']
block5a_project_conv (Conv2D) (None, 14, 14, 112) 53760 ['block5a_se_excite[0][0]']
block5a_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5a_project_conv[0][0]']
lization)
block5b_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5a_project_bn[0][0]']
block5b_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5b_expand_conv[0][0]']
ization)
block5b_expand_activation (Act (None, 14, 14, 672) 0 ['block5b_expand_bn[0][0]']
ivation)
block5b_dwconv (DepthwiseConv2 (None, 14, 14, 672) 16800 ['block5b_expand_activation[0][0]
D) ']
block5b_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5b_dwconv[0][0]']
)
block5b_activation (Activation (None, 14, 14, 672) 0 ['block5b_bn[0][0]']
)
block5b_se_squeeze (GlobalAver (None, 672) 0 ['block5b_activation[0][0]']
agePooling2D)
block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5b_se_squeeze[0][0]']
block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5b_se_reshape[0][0]']
block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5b_se_reduce[0][0]']
block5b_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5b_activation[0][0]',
'block5b_se_expand[0][0]']
block5b_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5b_se_excite[0][0]']
block5b_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5b_project_conv[0][0]']
lization)
block5b_drop (Dropout) (None, 14, 14, 112) 0 ['block5b_project_bn[0][0]']
block5b_add (Add) (None, 14, 14, 112) 0 ['block5b_drop[0][0]',
'block5a_project_bn[0][0]']
block5c_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5b_add[0][0]']
block5c_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5c_expand_conv[0][0]']
ization)
block5c_expand_activation (Act (None, 14, 14, 672) 0 ['block5c_expand_bn[0][0]']
ivation)
block5c_dwconv (DepthwiseConv2 (None, 14, 14, 672) 16800 ['block5c_expand_activation[0][0]
D) ']
block5c_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5c_dwconv[0][0]']
)
block5c_activation (Activation (None, 14, 14, 672) 0 ['block5c_bn[0][0]']
)
block5c_se_squeeze (GlobalAver (None, 672) 0 ['block5c_activation[0][0]']
agePooling2D)
block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5c_se_squeeze[0][0]']
block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5c_se_reshape[0][0]']
block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5c_se_reduce[0][0]']
block5c_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5c_activation[0][0]',
'block5c_se_expand[0][0]']
block5c_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5c_se_excite[0][0]']
block5c_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5c_project_conv[0][0]']
lization)
block5c_drop (Dropout) (None, 14, 14, 112) 0 ['block5c_project_bn[0][0]']
block5c_add (Add) (None, 14, 14, 112) 0 ['block5c_drop[0][0]',
'block5b_add[0][0]']
block6a_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5c_add[0][0]']
block6a_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block6a_expand_conv[0][0]']
ization)
block6a_expand_activation (Act (None, 14, 14, 672) 0 ['block6a_expand_bn[0][0]']
ivation)
block6a_dwconv_pad (ZeroPaddin (None, 17, 17, 672) 0 ['block6a_expand_activation[0][0]
g2D) ']
block6a_dwconv (DepthwiseConv2 (None, 7, 7, 672) 16800 ['block6a_dwconv_pad[0][0]']
D)
block6a_bn (BatchNormalization (None, 7, 7, 672) 2688 ['block6a_dwconv[0][0]']
)
block6a_activation (Activation (None, 7, 7, 672) 0 ['block6a_bn[0][0]']
)
block6a_se_squeeze (GlobalAver (None, 672) 0 ['block6a_activation[0][0]']
agePooling2D)
block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block6a_se_squeeze[0][0]']
block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block6a_se_reshape[0][0]']
block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block6a_se_reduce[0][0]']
block6a_se_excite (Multiply) (None, 7, 7, 672) 0 ['block6a_activation[0][0]',
'block6a_se_expand[0][0]']
block6a_project_conv (Conv2D) (None, 7, 7, 192) 129024 ['block6a_se_excite[0][0]']
block6a_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6a_project_conv[0][0]']
lization)
block6b_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6a_project_bn[0][0]']
block6b_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6b_expand_conv[0][0]']
ization)
block6b_expand_activation (Act (None, 7, 7, 1152) 0 ['block6b_expand_bn[0][0]']
ivation)
block6b_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6b_expand_activation[0][0]
D) ']
block6b_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6b_dwconv[0][0]']
)
block6b_activation (Activation (None, 7, 7, 1152) 0 ['block6b_bn[0][0]']
)
block6b_se_squeeze (GlobalAver (None, 1152) 0 ['block6b_activation[0][0]']
agePooling2D)
block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6b_se_squeeze[0][0]']
block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6b_se_reshape[0][0]']
block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6b_se_reduce[0][0]']
block6b_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6b_activation[0][0]',
'block6b_se_expand[0][0]']
block6b_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6b_se_excite[0][0]']
block6b_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6b_project_conv[0][0]']
lization)
block6b_drop (Dropout) (None, 7, 7, 192) 0 ['block6b_project_bn[0][0]']
block6b_add (Add) (None, 7, 7, 192) 0 ['block6b_drop[0][0]',
'block6a_project_bn[0][0]']
block6c_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6b_add[0][0]']
block6c_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6c_expand_conv[0][0]']
ization)
block6c_expand_activation (Act (None, 7, 7, 1152) 0 ['block6c_expand_bn[0][0]']
ivation)
block6c_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6c_expand_activation[0][0]
D) ']
block6c_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6c_dwconv[0][0]']
)
block6c_activation (Activation (None, 7, 7, 1152) 0 ['block6c_bn[0][0]']
)
block6c_se_squeeze (GlobalAver (None, 1152) 0 ['block6c_activation[0][0]']
agePooling2D)
block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6c_se_squeeze[0][0]']
block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6c_se_reshape[0][0]']
block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6c_se_reduce[0][0]']
block6c_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6c_activation[0][0]',
'block6c_se_expand[0][0]']
block6c_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6c_se_excite[0][0]']
block6c_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6c_project_conv[0][0]']
lization)
block6c_drop (Dropout) (None, 7, 7, 192) 0 ['block6c_project_bn[0][0]']
block6c_add (Add) (None, 7, 7, 192) 0 ['block6c_drop[0][0]',
'block6b_add[0][0]']
block6d_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6c_add[0][0]']
block6d_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6d_expand_conv[0][0]']
ization)
block6d_expand_activation (Act (None, 7, 7, 1152) 0 ['block6d_expand_bn[0][0]']
ivation)
block6d_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 28800 ['block6d_expand_activation[0][0]
D) ']
block6d_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6d_dwconv[0][0]']
)
block6d_activation (Activation (None, 7, 7, 1152) 0 ['block6d_bn[0][0]']
)
block6d_se_squeeze (GlobalAver (None, 1152) 0 ['block6d_activation[0][0]']
agePooling2D)
block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6d_se_squeeze[0][0]']
block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6d_se_reshape[0][0]']
block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6d_se_reduce[0][0]']
block6d_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6d_activation[0][0]',
'block6d_se_expand[0][0]']
block6d_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6d_se_excite[0][0]']
block6d_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6d_project_conv[0][0]']
lization)
block6d_drop (Dropout) (None, 7, 7, 192) 0 ['block6d_project_bn[0][0]']
block6d_add (Add) (None, 7, 7, 192) 0 ['block6d_drop[0][0]',
'block6c_add[0][0]']
block7a_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6d_add[0][0]']
block7a_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block7a_expand_conv[0][0]']
ization)
block7a_expand_activation (Act (None, 7, 7, 1152) 0 ['block7a_expand_bn[0][0]']
ivation)
block7a_dwconv (DepthwiseConv2 (None, 7, 7, 1152) 10368 ['block7a_expand_activation[0][0]
D) ']
block7a_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block7a_dwconv[0][0]']
)
block7a_activation (Activation (None, 7, 7, 1152) 0 ['block7a_bn[0][0]']
)
block7a_se_squeeze (GlobalAver (None, 1152) 0 ['block7a_activation[0][0]']
agePooling2D)
block7a_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block7a_se_squeeze[0][0]']
block7a_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block7a_se_reshape[0][0]']
block7a_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block7a_se_reduce[0][0]']
block7a_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block7a_activation[0][0]',
'block7a_se_expand[0][0]']
block7a_project_conv (Conv2D) (None, 7, 7, 320) 368640 ['block7a_se_excite[0][0]']
block7a_project_bn (BatchNorma (None, 7, 7, 320) 1280 ['block7a_project_conv[0][0]']
lization)
top_conv (Conv2D) (None, 7, 7, 1280) 409600 ['block7a_project_bn[0][0]']
top_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['top_conv[0][0]']
top_activation (Activation) (None, 7, 7, 1280) 0 ['top_bn[0][0]']
global_average_pooling2d_1 (Gl (None, 1280) 0 ['top_activation[0][0]']
obalAveragePooling2D)
dropout_3 (Dropout) (None, 1280) 0 ['global_average_pooling2d_1[0][0
]']
dense_3 (Dense) (None, 1024) 1311744 ['dropout_3[0][0]']
dropout_4 (Dropout) (None, 1024) 0 ['dense_3[0][0]']
dense_4 (Dense) (None, 512) 524800 ['dropout_4[0][0]']
dropout_5 (Dropout) (None, 512) 0 ['dense_4[0][0]']
dense_5 (Dense) (None, 256) 131328 ['dropout_5[0][0]']
batch_normalization_1 (BatchNo (None, 256) 1024 ['dense_5[0][0]']
rmalization)
class_op (Dense) (None, 196) 50372 ['batch_normalization_1[0][0]']
reg_op (Dense) (None, 4) 5124 ['dropout_3[0][0]']
==================================================================================================
Total params: 6,073,963
Trainable params: 6,031,428
Non-trainable params: 42,535
__________________________________________________________________________________________________
optimizerVar = Adam(lr=0.0001)
final_model_2.compile(optimizer=optimizerVar,
loss={'reg_op':'mse', 'class_op':'categorical_crossentropy'},
metrics={'reg_op':[IoU], 'class_op':['accuracy']})
C:\Users\adity\miniconda3\envs\capstone\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:114: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead. super().__init__(name, **kwargs)
#Create train and test generator
batchsize = 16
train_generator = batch_generator(train_df, batch_size=batchsize, model_used='efficientnet') #batchsize can be changed
test_generator = batch_generator(test_df, batch_size=batchsize, model_used='efficientnet')
history_2 = final_model_2.fit(train_generator,
epochs=20,
steps_per_epoch= train_df.shape[0]//batchsize,
validation_data=test_generator,
validation_steps = test_df.shape[0]//batchsize)
Epoch 1/20 509/509 [==============================] - 215s 401ms/step - loss: 5.4026 - class_op_loss: 5.3692 - reg_op_loss: 0.0334 - class_op_accuracy: 0.0255 - reg_op_IoU: 0.5357 - val_loss: 5.0016 - val_class_op_loss: 4.9788 - val_reg_op_loss: 0.0228 - val_class_op_accuracy: 0.0574 - val_reg_op_IoU: 0.5929 Epoch 2/20 509/509 [==============================] - 203s 400ms/step - loss: 4.3734 - class_op_loss: 4.3563 - reg_op_loss: 0.0171 - class_op_accuracy: 0.0602 - reg_op_IoU: 0.6263 - val_loss: 3.6661 - val_class_op_loss: 3.6521 - val_reg_op_loss: 0.0139 - val_class_op_accuracy: 0.0595 - val_reg_op_IoU: 0.6538 Epoch 3/20 509/509 [==============================] - 203s 400ms/step - loss: 3.1905 - class_op_loss: 3.1771 - reg_op_loss: 0.0134 - class_op_accuracy: 0.0596 - reg_op_IoU: 0.6479 - val_loss: 2.9690 - val_class_op_loss: 2.9578 - val_reg_op_loss: 0.0112 - val_class_op_accuracy: 0.0615 - val_reg_op_IoU: 0.6667 Epoch 4/20 509/509 [==============================] - 203s 400ms/step - loss: 2.8898 - class_op_loss: 2.8787 - reg_op_loss: 0.0111 - class_op_accuracy: 0.0648 - reg_op_IoU: 0.6737 - val_loss: 2.8459 - val_class_op_loss: 2.8365 - val_reg_op_loss: 0.0094 - val_class_op_accuracy: 0.0661 - val_reg_op_IoU: 0.6967 Epoch 5/20 509/509 [==============================] - 202s 396ms/step - loss: 2.8333 - class_op_loss: 2.8244 - reg_op_loss: 0.0090 - class_op_accuracy: 0.0581 - reg_op_IoU: 0.6961 - val_loss: 2.8131 - val_class_op_loss: 2.8051 - val_reg_op_loss: 0.0080 - val_class_op_accuracy: 0.0585 - val_reg_op_IoU: 0.7150 Epoch 6/20 509/509 [==============================] - 203s 399ms/step - loss: 2.8066 - class_op_loss: 2.7986 - reg_op_loss: 0.0080 - class_op_accuracy: 0.0695 - reg_op_IoU: 0.7109 - val_loss: 2.8056 - val_class_op_loss: 2.7985 - val_reg_op_loss: 0.0071 - val_class_op_accuracy: 0.0650 - val_reg_op_IoU: 0.7234 Epoch 7/20 509/509 [==============================] - 204s 401ms/step - loss: 2.8017 - class_op_loss: 2.7941 - reg_op_loss: 0.0076 - class_op_accuracy: 0.0624 - reg_op_IoU: 0.7177 - val_loss: 2.7916 - val_class_op_loss: 2.7853 - val_reg_op_loss: 0.0064 - val_class_op_accuracy: 0.0656 - val_reg_op_IoU: 0.7389 Epoch 8/20 509/509 [==============================] - 202s 398ms/step - loss: 2.7956 - class_op_loss: 2.7887 - reg_op_loss: 0.0069 - class_op_accuracy: 0.0613 - reg_op_IoU: 0.7281 - val_loss: 2.8261 - val_class_op_loss: 2.8202 - val_reg_op_loss: 0.0060 - val_class_op_accuracy: 0.0646 - val_reg_op_IoU: 0.7482 Epoch 9/20 509/509 [==============================] - 204s 400ms/step - loss: 2.7932 - class_op_loss: 2.7867 - reg_op_loss: 0.0066 - class_op_accuracy: 0.0596 - reg_op_IoU: 0.7337 - val_loss: 2.8004 - val_class_op_loss: 2.7940 - val_reg_op_loss: 0.0064 - val_class_op_accuracy: 0.0674 - val_reg_op_IoU: 0.7406 Epoch 10/20 509/509 [==============================] - 204s 400ms/step - loss: 2.7913 - class_op_loss: 2.7847 - reg_op_loss: 0.0066 - class_op_accuracy: 0.0565 - reg_op_IoU: 0.7366 - val_loss: 2.8337 - val_class_op_loss: 2.8255 - val_reg_op_loss: 0.0082 - val_class_op_accuracy: 0.0641 - val_reg_op_IoU: 0.7431 Epoch 11/20 509/509 [==============================] - 206s 406ms/step - loss: 2.7908 - class_op_loss: 2.7844 - reg_op_loss: 0.0065 - class_op_accuracy: 0.0646 - reg_op_IoU: 0.7397 - val_loss: 2.7991 - val_class_op_loss: 2.7926 - val_reg_op_loss: 0.0065 - val_class_op_accuracy: 0.0641 - val_reg_op_IoU: 0.7554 Epoch 12/20 509/509 [==============================] - 204s 402ms/step - loss: 2.7914 - class_op_loss: 2.7850 - reg_op_loss: 0.0064 - class_op_accuracy: 0.0646 - reg_op_IoU: 0.7409 - val_loss: 2.7845 - val_class_op_loss: 2.7789 - val_reg_op_loss: 0.0057 - val_class_op_accuracy: 0.0637 - val_reg_op_IoU: 0.7579 Epoch 13/20 509/509 [==============================] - 202s 396ms/step - loss: 2.7889 - class_op_loss: 2.7829 - reg_op_loss: 0.0060 - class_op_accuracy: 0.0648 - reg_op_IoU: 0.7468 - val_loss: 2.7989 - val_class_op_loss: 2.7939 - val_reg_op_loss: 0.0050 - val_class_op_accuracy: 0.0611 - val_reg_op_IoU: 0.7738 Epoch 14/20 509/509 [==============================] - 205s 403ms/step - loss: 2.7874 - class_op_loss: 2.7820 - reg_op_loss: 0.0053 - class_op_accuracy: 0.0608 - reg_op_IoU: 0.7589 - val_loss: 2.7827 - val_class_op_loss: 2.7788 - val_reg_op_loss: 0.0039 - val_class_op_accuracy: 0.0693 - val_reg_op_IoU: 0.7860 Epoch 15/20 509/509 [==============================] - 205s 403ms/step - loss: 2.7828 - class_op_loss: 2.7776 - reg_op_loss: 0.0051 - class_op_accuracy: 0.0652 - reg_op_IoU: 0.7619 - val_loss: 2.7955 - val_class_op_loss: 2.7913 - val_reg_op_loss: 0.0043 - val_class_op_accuracy: 0.0624 - val_reg_op_IoU: 0.7796 Epoch 16/20 509/509 [==============================] - 207s 407ms/step - loss: 2.7850 - class_op_loss: 2.7801 - reg_op_loss: 0.0049 - class_op_accuracy: 0.0626 - reg_op_IoU: 0.7681 - val_loss: 2.7804 - val_class_op_loss: 2.7771 - val_reg_op_loss: 0.0033 - val_class_op_accuracy: 0.0586 - val_reg_op_IoU: 0.8028 Epoch 17/20 509/509 [==============================] - 203s 399ms/step - loss: 2.7841 - class_op_loss: 2.7794 - reg_op_loss: 0.0047 - class_op_accuracy: 0.0609 - reg_op_IoU: 0.7728 - val_loss: 2.7805 - val_class_op_loss: 2.7762 - val_reg_op_loss: 0.0043 - val_class_op_accuracy: 0.0620 - val_reg_op_IoU: 0.7932 Epoch 18/20 509/509 [==============================] - 203s 398ms/step - loss: 2.7865 - class_op_loss: 2.7819 - reg_op_loss: 0.0046 - class_op_accuracy: 0.0636 - reg_op_IoU: 0.7759 - val_loss: 2.7920 - val_class_op_loss: 2.7877 - val_reg_op_loss: 0.0043 - val_class_op_accuracy: 0.0620 - val_reg_op_IoU: 0.7919 Epoch 19/20 509/509 [==============================] - 204s 402ms/step - loss: 2.7864 - class_op_loss: 2.7817 - reg_op_loss: 0.0047 - class_op_accuracy: 0.0623 - reg_op_IoU: 0.7770 - val_loss: 2.7851 - val_class_op_loss: 2.7816 - val_reg_op_loss: 0.0035 - val_class_op_accuracy: 0.0615 - val_reg_op_IoU: 0.7988 Epoch 20/20 509/509 [==============================] - 206s 405ms/step - loss: 2.7860 - class_op_loss: 2.7816 - reg_op_loss: 0.0044 - class_op_accuracy: 0.0661 - reg_op_IoU: 0.7841 - val_loss: 2.7837 - val_class_op_loss: 2.7802 - val_reg_op_loss: 0.0035 - val_class_op_accuracy: 0.0596 - val_reg_op_IoU: 0.8117
# final_model_2.save('./eff_model_rcnn.h5')
final_model_2.save_weights('./eff_model_rcnn.h5')
acc = history_2.history['class_op_accuracy']
val_acc = history_2.history['val_class_op_accuracy']
iou = history_2.history['reg_op_IoU']
val_iou = history_2.history['val_reg_op_IoU']
epochs_range = range(20)
plt.figure(figsize=(16, 8))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, iou, label='Training IOU')
plt.plot(epochs_range, val_iou, label='Validation IOU')
plt.legend(loc='upper right')
plt.title('Training and Validation IOU')
plt.show()
Insights:
1) Accuracy Plot:
- Training Accuracy: The training accuracy increases steadily over epochs, indicating that the model is learning from the training data and improving its ability to classify objects accurately.
- Validation Accuracy: Initially, the validation accuracy shows an increasing trend, suggesting that the model generalizes well to unseen data. However, it seems to fluctuate later on, which could indicate some instability or overfitting.
2) IOU Plot:
- Training IOU: The training IOU increases over epochs, indicating that the model is improving its ability to localize objects accurately.
- Validation IOU: Similar to validation accuracy, the validation IOU shows fluctuations, possibly indicating overfitting or instability.
3) Overall Insights:
- The model demonstrates an overall positive trend in both training accuracy and training IOU, indicating effective learning and improvement in object classification and localization.
- However, fluctuations in validation accuracy and IOU suggest potential issues with model generalization or stability, which may require further investigation and refinement of the model architecture or training process.
def predict_and_draw(image_num, df):
#Load image
img = tf.keras.preprocessing.image.load_img(df.loc[image_num, 'path'])
w, h = img.size
#Prepare input for model
#1. Resize image
img_resized = img.resize((img_size, img_size))
#2. Conver to array and make it a batch of 1
input_array = tf.keras.preprocessing.image.img_to_array(img_resized)
input_array = np.expand_dims(input_array, axis=0)
#3. Normalize image data
input_array = tf.keras.applications.efficientnet.preprocess_input(input_array)
#Prediction
pred = final_model_2.predict(input_array)
pred_1 = EffNet_model.predict(input_array)
#Get classification and regression predictions
label_pred, bbox_pred = pred_1, pred[1][0]
#Get Label with highest probability
pred_class = label_class_dict[np.argmax(label_pred)]
#Read actual label and bounding box
act_class = df.loc[image_num, 'Image class']
act_class = df.loc[image_num, 'label']
xmin, ymin, xmax, ymax = df.loc[image_num, ['x_min', 'y_min', 'x_max', 'y_max']]
print('Real Label :', act_class, '\nPredicted Label: ', pred_class)
#Draw bounding boxes - Actual (Red) and Predicted(Green)
img = cv2.imread(df.loc[image_num, 'path'])
#Draw actual bounding box - Red
img = cv2.rectangle(img, (xmin, ymin),
(xmax, ymax), (0,0,255), 3)
#Draw predicted bounding box - Green
img = cv2.rectangle(img, (int(bbox_pred[0]*w), int(bbox_pred[1]*h)),
(int((bbox_pred[0]+bbox_pred[2])*w), int((bbox_pred[1]+bbox_pred[3])*h)), (0,255,0), 3
)
#Display the picture
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
plt.imshow(img)
plt.show()
#Predict on Test Dataset
for i in range(0,5):
image_num = np.random.randint(0, test_df.shape[0])
predict_and_draw(image_num, test_df)
1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 31ms/step Real Label : Ford Mustang Convertible 2007 Predicted Label: Dodge Ram Pickup 3500 Quad Cab 2009
1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 31ms/step Real Label : Buick Rainier SUV 2007 Predicted Label: BMW 3 Series Sedan 2012
1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 32ms/step Real Label : Acura TL Type-S 2008 Predicted Label: Chevrolet Malibu Hybrid Sedan 2010
1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 28ms/step Real Label : Chevrolet Monte Carlo Coupe 2007 Predicted Label: Chevrolet Corvette Convertible 2012
1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 27ms/step Real Label : Acura Integra Type R 2001 Predicted Label: Chrysler 300 SRT-8 2010
testData_generator = batch_generator(test_df, batch_size=8, model_used='efficientnet')
print('Evaluating')
# Re-evaluate the model
test_results_2 = final_model_2.evaluate(testData_generator, verbose=1, batch_size = 16, steps = 509, return_dict = True)
acc = test_results_2['reg_op_IoU']
print("Restored model, IOU: {:5.2f}%".format(100 * acc))
Evaluating 509/509 [==============================] - 33s 65ms/step - loss: 2.7863 - class_op_loss: 2.7829 - reg_op_loss: 0.0034 - class_op_accuracy: 0.0570 - reg_op_IoU: 0.8102 Restored model, IOU: 81.02%
Pickling the models for future use¶
import pickle
pickle.dump(model, open('model_0.pkl', 'wb'))
pickle.dump(model, open('model_1.pkl', 'wb'))
pickle.dump(model, open('model_2.pkl', 'wb'))
pickle.dump(model, open('ResNet_model.pkl', 'wb'))
pickle.dump(model, open('MobileNet_model.pkl', 'wb'))
pickle.dump(model, open('EffNet_model.pkl', 'wb'))
pickle.dump(model, open('final_model.pkl', 'wb')) # MobileNet with Regression
pickle.dump(model, open('final_model_2.pkl', 'wb')) # EfficientNet with Regression
Conclusion¶
We built a number of model architectures using different methodologies ranging from basic CNN models to using transfer learning in combination with Faster RCNN. Out of all the models, we found mobile net with Fast RCNN to perform the best, providing a good balance between accuracy of the class the showing good results in the regression part having very similar bounding box with the actual one. In future, we will me trying my more advance algorithms like YOLOv5 and SSD as well.